

  • Install VMware Aria Suite Lifecycle 8.12 Patch 2 before upgrading to 8.14 for 7 crucial reasons

    The product team for VMware Aria Suite Lifecycle has released a new update, "VMware Aria Suite Lifecycle 8.12 Patch 2", which addresses several crucial issues. Customers planning to upgrade from 8.12 to 8.14 are advised to first apply Patch 2 to their existing 8.12 installation and then proceed with the upgrade to 8.14 for a smooth, uninterrupted experience.

Release Notes: https://docs.vmware.com/en/VMware-Aria-Suite-Lifecycle/8.12/rn/vmware-aria-suite-lifecycle-812-patch-2-release-notes/index.html

Note: It's crucial to take snapshots before upgrading. If a failure occurs, roll back, analyze, and proceed with caution; avoid repeated upgrade attempts on the failed environment.

What are those 7 reasons?

1. Fixes the "Check Online" upgrade method of VMware Aria Suite Lifecycle when configured with a proxy. If a proxy is set up in Suite Lifecycle, the "Check Online" upgrade method may fail with an exception stating "No Upgrade Available". Patch 2 resolves this by ensuring that requests from Suite Lifecycle to the upgrade repository honor the configured proxy settings.

2. Updates the checksum for the Windows Connector, enabling a successful download of VMware Identity Manager 3.3.7 binaries. When mapping VMware Identity Manager binaries, a checksum exception arises because of a recent change in the Windows connector build. Users on Suite Lifecycle 8.12 who plan to download the 3.3.7 binaries should apply Patch 2 so that the updated checksum is integrated into Suite Lifecycle and the binaries map successfully.

3. Fixes the collection of all logs needed for debugging a VMware Aria Suite Lifecycle upgrade. In the event of an upgrade failure, all relevant logs must be gathered for debugging. With Suite Lifecycle 8.12 Patch 2 installed, the following key logs are collected comprehensively during an upgrade:
/var/log/vmware/capengine/cap-non-lvm-update/installer-*
/var/log/vmware/capengine/cap-non-lvm-update/workflow.log
/var/log/vmware/cap_am/appliance_update_list_ms.log
/var/log/bootstrap/postupdate.log
/data/script.log

4. Corrects the descriptions of the VMware Aria Suite Lifecycle upgrade pre-checks. The descriptions of the pre-checks conducted prior to the upgrade were identical; this has been rectified in Patch 2.

5. Improves the upgrade pre-validation report by explicitly displaying the KB link when the root partition check fails. One of the pre-checks verifies whether there is sufficient space in the / partition. With Patch 2, if this pre-check raises a warning during an upgrade attempt, the report displays the relevant KB article outlining steps to free up space under /.

6. Resolves the upgrade problem in VMware Aria Suite Lifecycle when FIPS is deactivated. In Suite Lifecycle 8.12 environments where FIPS is disabled, an upgrade may stall during the post-update phase. A preemptive workaround is to create a /var/log/bootstrap/fips_disable flag before initiating the upgrade; if the issue has already been hit, https://kb.vmware.com/s/article/95231 provides the steps to recover. Patch 2 resolves the problem and eliminates the need for these workarounds.

7. Addresses issues with Operations for Logs scale-out operations. The scale-out process has been improved to prevent failures when new nodes attempt to join the cluster.
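Since the article calls out a specific set of logs to gather when an upgrade fails, here is a small hedged sketch (not an official VMware tool) that bundles exactly those paths into one archive; the `LOG_ROOT` parameter is an assumption added so the function can also be pointed at a copied log tree.

```shell
#!/bin/sh
# Hedged sketch: bundle the upgrade logs listed above into a single archive.
# collect_upgrade_logs <log_root> <output_archive>
collect_upgrade_logs() {
    log_root="$1"; out="$2"
    # The paths below are exactly the ones called out in the article.
    tar -czf "$out" -C "$log_root" \
        var/log/vmware/capengine/cap-non-lvm-update \
        var/log/vmware/cap_am/appliance_update_list_ms.log \
        var/log/bootstrap/postupdate.log \
        data/script.log 2>/dev/null
}

# On a live appliance one would run, e.g.:
#   collect_upgrade_logs / /tmp/lcm-upgrade-logs.tar.gz
```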

  • VMware Aria Suite Lifecycle 8.14 PSPACK 2 | Installation | Demo |

    VMware Aria Suite Lifecycle 8.14 Product Support Pack 2, or PSPACK2 as we call it, offers support for VMware Aria Operations 8.14.1 and VMware Aria Operations for Logs 8.14.1. I recorded a small demo which explains the logs to monitor and the process involved in implementing it.

Logs to monitor:
/var/log/vrlcm/vmware_vrlcm.log
/var/log/vrlcm/bootstrap.log
/var/log/vrlcm/patchcli.log

Once the PSPACK is implemented successfully, you would see the following messages in the logs.

Reference: /var/log/vrlcm/patchcli.log

2023-11-24 13:14:46,865 - __main__ - INFO - Metadata: {"patchInfo":{"name":"VMware Aria Suite Lifecycle, version 8.14.0 Pspack 2","summary":"Cumulative pspack bundle for vRealize Suite Lifecycle Manager","description":"This cumulative pspack bundle provides fixes to issues observed with various VMware Aria Suite Lifecycle components. Refer the associated docUrl for more details.","kbUrl":"https:\/\/docs.vmware.com\/en\/VMware-vRealize-Lifecycle-Manager\/8.14.0\/rn\/vRealize-Lifecycle-Manager-814-Pspack-2.html","eulaFile":"","category":"bugfix","urgency":"critical","releaseType":"pspack","releaseDate":1700808903000,"additionaInfo":{}},"metadataId":"vrlcm-8.14.0-PSPACK2","metadataVersion":"1","patchId":"6813196a-22c8-4425-a527-3e86a4d30502","patchBundleCreationDate":1700808903,"selfPatch":true,"product":{"productId":"vrlcm","productName":"VMware Aria Suite Lifecycle","productVersion":"8.14.0","supportedVersions":["8.14.0"],"productBuild":"10689094","productPatchBuild":"","additionaInfo":{"patchInstructions":"mkdir -p \/data\/tmp-pspack-81402\/10318114; cp -r /tmp/10318114/VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak \/data\/tmp-pspack-81402\/10318114; cd \/data\/tmp-pspack-81402\/10318114; unzip VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak; unzip lcm_PSPACK_artifacts.zip; cp -r \/tmp\/10318114\/lcm_pspack_metadata.json \/data\/tmp-pspack-81402\/10318114; sh pre_pspack_instructions.sh; sh pspack_instructions.sh \/data\/tmp-pspack-81402\/10318114 
6813196a-22c8-4425-a527-3e86a4d30502;"},"patchAlreadyApplied":false},"payload":{"productPatchLevel":"PSPACK2","patchPayloadFilename":"VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak","patchPayloadUri":"","patchPayloadSize":395054038,"sha256sum":"041c06b1e11c83b6d8c20ba915866aa2e2fbd7b2eee79cc1df2d284e3ea2dedb","productMinorLevel":null},"patchFileName":"vrlcm-8.14.0-PSPACK2.pspak","patchSize":0,"patchSha256sum":"","patchRunningCounter":2,"patchStatus":"ACTIVE","patchDownloadStatus":null,"extract":false,"patchCounter":"2"} 2023-11-24 13:14:46,879 - __main__ - INFO - Patch File: /tmp/10318114//VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak 2023-11-24 13:14:46,879 - __main__ - INFO - metadata after parsing : 2023-11-24 13:14:46,879 - __main__ - INFO - patch instructions:mkdir -p /data/tmp-pspack-81402/10318114; cp -r /tmp/10318114/VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak /data/tmp-pspack-81402/10318114; cd /data/tmp-pspack-81402/10318114; unzip VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak; unzip lcm_PSPACK_artifacts.zip; cp -r /tmp/10318114/lcm_pspack_metadata.json /data/tmp-pspack-81402/10318114; sh pre_pspack_instructions.sh; sh pspack_instructions.sh /data/tmp-pspack-81402/10318114 6813196a-22c8-4425-a527-3e86a4d30502; 2023-11-24 13:14:46,880 - __main__ - INFO - installing patch ... 
Archive: VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak extracting: lcm_PSPACK_artifacts.zip Archive: lcm_PSPACK_artifacts.zip creating: os/ inflating: vmlcm-service-gui-8.14.0-SNAPSHOT.jar inflating: vmlcm-service-8.14.0-SNAPSHOT.jar inflating: vmware-service-configuration.jar extracting: blackstone.zip inflating: APUAT-8.5.0.18176777.pak inflating: APUAT-for-8.10.x-8.14.1.22799028.pak inflating: pre_pspack_instructions.sh inflating: pspack_instructions.sh inflating: policy.json inflating: post_patch_instructions.sh inflating: dev-build-upgrade.sh inflating: populate.sh inflating: patchcli.py inflating: patchcliproxy inflating: vlcm-support inflating: vrlcm-server.service 2023-11-24_13:14:54 Pre-Product Support Pack - vRSLCM 2023-11-24_13:14:54 Cleaning backups from old location... 2023-11-24_13:14:54 Cleaning previous backups... 2023-11-24_13:14:54 Creating new backup file... 2023-11-24_13:15:16 Backup done. 2023-11-24_13:15:16 Copy script to /var/lib/vrlcm 2023-11-24 13:16:40,265 - __main__ - INFO - patch installation process ended. 2023-11-24 13:16:40,265 - __main__ - INFO - patch installation completed.

After the message above, the appliance reboots. Then the work begins under bootstrap.log and vmware_vrlcm.log. To validate from the UI, one can check
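The patch metadata above carries a `sha256sum` for the PSPACK payload. As a hedged sketch (not part of the official procedure), the downloaded `.pspak` can be checked against that value before installation; the function name here is my own.

```shell
#!/bin/sh
# Hedged sketch: verify a downloaded .pspak against the sha256sum carried in
# the patch metadata before installing it.
verify_pspack() {
    file="$1"; expected="$2"
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "checksum OK"
    else
        echo "checksum MISMATCH: got $actual" >&2
        return 1
    fi
}

# Example with the PSPACK2 values from the metadata shown above:
#   verify_pspack VMware-vLCM-Appliance-8.14.0-PSPACK2.pspak \
#       041c06b1e11c83b6d8c20ba915866aa2e2fbd7b2eee79cc1df2d284e3ea2dedb
```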

  • What's VMware Identity Manager Cluster Auto-Recovery in VMware Aria Suite Lifecycle 8.14

    VMware Aria Suite Lifecycle 8.14 introduces an innovative capability known as "VMware Identity Manager Cluster Auto-Recovery".

Why do we need this? The aim of this 'autorecovery' service is to minimize the need for the time-consuming 'Remediate' operation from the Suite Lifecycle UI. In greenfield deployments this feature is activated automatically, while in brownfield deployments it needs to be enabled manually after upgrading to VMware Aria Suite Lifecycle 8.14.

How does it work? The 'autorecovery' service is deployed as a Linux service and operates on all three nodes within the vIDM cluster. Operations like start/restart of the pgPool service are controlled by the script individually on each node. The cluster VIP or delegate IP is handled only on nodes with the primary role, and detachment of the VIP is done on standby nodes based on their "standby" role. Operations like "recovery" are synchronized based on the node's status as database "primary", or in one case as cluster "leader" if all nodes are in standby. This ensures that no duplicate operations are triggered by any of the nodes.

What challenges or issues does this feature tackle?
Cluster-VIP loss on the primary
Cluster issues due to network outages
Recovery of "down" cluster node(s)
Avoids the need to initiate "Remediate" from the UI in most cases, which involves node restarts contributing to downtime; the script also eliminates the need to reboot vIDM nodes because of PostgreSQL cluster problems
Recovery in cases with significant replication delay ('significant' is configurable in bytes, say more than 1000 bytes of lag between primary and secondary)
Recovery in the rare case where all nodes are in a 'standby' state
Prevents discrepancies in /etc/hosts

Is there any downtime during the execution of auto-recovery? No, there is no downtime when the auto-recovery script is triggered in the backend for any of the reasons above.

How do I enable and disable this feature? Once enabled, users can go to the day-2 operations pane of globalenvironment (vIDM) and choose to disable it, and vice versa.
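To make the byte-based replication-delay trigger above concrete, here is an illustration only (the real threshold and query live inside the autorecovery service, which is not public): on a PostgreSQL primary, lag in bytes can be read via `pg_stat_replication`, and the comparison against the article's example threshold of 1000 bytes might look like this. The function name and threshold variable are my own.

```shell
#!/bin/sh
# Illustration of a byte-based replication-lag check. On the database primary
# the lag in bytes can be read with, e.g.:
#   psql -Atc "SELECT pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn)
#              FROM pg_stat_replication;"
LAG_THRESHOLD_BYTES=1000  # the article's example of a 'significant' lag

lag_is_significant() {
    # $1 = current replication lag in bytes
    [ "$1" -gt "$LAG_THRESHOLD_BYTES" ]
}

lag_is_significant 2048 && echo "lag is significant; recovery would be considered"
```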

  • VMware Aria Suite Lifecycle Upgrade to 8.14 | Postupdate phase failure |

    This is a rare occurrence and might be seen on appliances which were deployed with FIPS mode disabled. I tried to explain this in a video recording; hope this helps. VMware KB article to be followed: https://kb.vmware.com/s/article/95231. Always ensure there's a snapshot taken before the upgrade.

Proactive Measure
If you're planning an upgrade of Suite Lifecycle 8.12 to 8.14, then:
1. Create a file called fips_disable: touch /var/log/bootstrap/fips_disable
2. Change permissions: chmod 644 /var/log/bootstrap/fips_disable
3. Go ahead and upgrade; it should be seamless.

Reactive Measure
If you have hit the problem, then you can come out of it in 2 ways. KB https://kb.vmware.com/s/article/95231 lists both methods.
Method 1
1. Revert the Suite Lifecycle appliance to the pre-upgrade snapshot.
2. Create a file called /var/log/bootstrap/fips_disable: touch /var/log/bootstrap/fips_disable
3. Set the correct permission on the file: chmod 644 /var/log/bootstrap/fips_disable
4. Retry the upgrade.
Method 2
Follow the instructions laid out under the second part of the Resolution section of the KB: https://kb.vmware.com/s/article/95231
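The proactive steps above can be consolidated into one small sketch. The flag path is taken verbatim from the article; the wrapper function is my own addition so the steps can be rehearsed against a scratch directory first.

```shell
#!/bin/sh
# Consolidated sketch of the proactive fips_disable flag creation.
# create_fips_disable_flag [flag_path]  (defaults to the path from the KB)
create_fips_disable_flag() {
    flag="${1:-/var/log/bootstrap/fips_disable}"
    mkdir -p "$(dirname "$flag")"   # ensure the bootstrap log directory exists
    touch "$flag"
    chmod 644 "$flag"
    ls -l "$flag"
}

# On the appliance, before starting the upgrade:
#   create_fips_disable_flag
```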

  • How to overcome / partition space warning during pre-check of Suite Lifecycle Upgrade to 8.14

    There's a pre-check whose limit was slightly raised to ensure there is enough space available on / before going for an upgrade. In environments which have been running for a while, you might encounter a warning or error telling you that there's not enough space.

How do I resolve this issue so that I can upgrade? Follow this KB article to fix it: https://kb.vmware.com/s/article/95238. I'll explain it in detail below anyway.

Let's begin. I created a short video which explains the same. As a first step, take a snapshot before making any changes. This is mandatory.

Understand what the current disk space looks like inside the appliance.
Command: df -h
Output:
root@tvasl [ ~ ]# df -h Filesystem Size Used Avail Use% Mounted on devtmpfs 2.9G 0 2.9G 0% /dev tmpfs 3.0G 20K 3.0G 1% /dev/shm tmpfs 3.0G 792K 3.0G 1% /run tmpfs 3.0G 0 3.0G 0% /sys/fs/cgroup /dev/mapper/system-system_0 9.8G 3.9G 5.4G 42% / tmpfs 3.0G 18M 2.9G 1% /tmp /dev/mapper/storage-storage_0 9.8G 81M 9.2G 1% /storage /dev/mapper/vg_lvm_snapshot-lv_lvm_snapshot 7.8G 24K 7.4G 1% /storage/lvm_snapshot /dev/mapper/vg_alt_root-lv_alt_root 9.8G 24K 9.3G 1% /storage/alt_root /dev/sda3 488M 40M 412M 9% /boot /dev/mapper/data-data_0 49G 33G 14G 71% /data /dev/sda2 10M 2.0M 8.1M 20% /boot/efi

In the output above, /dev/mapper/system-system_0 is the device mounted on the / partition. In this example I do have ample space, but what if I don't, and which files should I delete to clear space?

Execute the command below to check which files in the appliance are taking up space above a certain size.
If I want to check for files above 100MB, I'll execute: find / -xdev -type f -size +100M -exec du -sh {} ';' | sort -rh | head -n50
If I want to check for files above 50MB, I'll execute: find / -xdev -type f -size +50M -exec du -sh {} ';' | sort -rh | head -n50
If I want to check for files above 25MB, I'll execute: find / -xdev -type f -size +25M -exec du -sh {} ';' | sort -rh | head -n50

I'll now get a list of files back; there might be some .jar and .bkp files, along with a package-pool folder containing a bunch of files. So how do I decide what to delete? The following locations are safe to delete. Let's discuss them in detail.

The backups in the following folder are created during Product Support Pack installation and are not cleared upon successful installation: /opt/vmware/vlcm/postgresbackups/ There will be only 1 backup inside this folder at any point in time, and you may delete it without any issues. One note for the future: once you upgrade to VMware Aria Suite Lifecycle 8.14 and apply a Product Support Pack, the database backups will be taken and stored under the /data partition. This ensures there's always space available under /.

Files under the blackstone backup folder are taken during previous Suite Lifecycle upgrades, and its contents can be safely deleted: /opt/vmware/vlcm/blackstone_bkp Whenever you upgrade, blackstone, the brain behind Suite Lifecycle's Content Management feature, stores its previous versions' files in this folder.

Files under package-pool can be deleted as well. These are the RPMs extracted and staged during your previous upgrade and are no longer needed: /opt/vmware/var/lib/vami/update/data/package-pool/package-pool/*.* Remember, if you're using VMware Aria Suite Lifecycle 8.12 and going to 8.14, the upgrades are powered by the CAP platform. CAP stands for Common Appliance Platform and is a replacement for VAMI, popularly known as the VMware Appliance Management Interface. Deleting files under package-pool will clear up lots of space.

After all that, re-run the pre-check and you should be fine with the upgrade.
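The cleanup above can be sketched as a single script that only touches the three locations the article marks as safe, reports sizes by default, and deletes only when explicitly asked. The function name and the `root`/`mode` parameters are my own additions so the script can be rehearsed against a copied tree before running it on the appliance.

```shell
#!/bin/sh
# Hedged sketch of the / partition cleanup described above.
# clean_lcm_root_space <root_prefix> <mode>   mode: "report" or "delete"
clean_lcm_root_space() {
    root="$1"; mode="$2"
    for dir in \
        "$root/opt/vmware/vlcm/postgresbackups" \
        "$root/opt/vmware/vlcm/blackstone_bkp" \
        "$root/opt/vmware/var/lib/vami/update/data/package-pool/package-pool"
    do
        [ -d "$dir" ] || continue
        if [ "$mode" = "delete" ]; then
            rm -rf "$dir"/*          # only contents; the folder itself stays
            echo "cleared $dir"
        else
            du -sh "$dir"            # report how much each location holds
        fi
    done
}

# Rehearse first, then delete:
#   clean_lcm_root_space "" report
#   clean_lcm_root_space "" delete
```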

  • Upgrading VMware Aria Suite Lifecycle | 8.12 to 8.14 | Demo |

    Pre-Requisites: Take snapshots. Review release notes.
Video
Important Logs
Pre-Upgrade & Upgrade Phase:
/var/log/vmware/capengine/cap-non-lvm-update/workflow.log
/var/log/vmware/capengine/cap-non-lvm-update/installer-<>.log
Post Update Phase:
/var/log/bootstrap/postupdate.log
/data/script.log
/var/log/vrlcm/vmware_vrlcm.log

  • Upgrading VMware Aria Suite Lifecycle | 8.12 to 8.14 | Deepdive |

    Pre-Requisites: Take snapshots. Review release notes. Note: For VCF-aware VMware Aria Suite Lifecycle environments, wait for the respective PSPACK (Product Support Pack) to be released.

Important Logs
Pre-Upgrade & Upgrade Phase:
/var/log/vmware/capengine/cap-non-lvm-update/workflow.log
/var/log/vmware/capengine/cap-non-lvm-update/installer-<>.log
Post Update Phase:
/var/log/bootstrap/postupdate.log
/data/script.log
/var/log/vrlcm/vmware_vrlcm.log

Procedure
We may use one of the repository methods to upgrade: Check Online, URL, or CD-ROM. In this example/demo we will be using the CD-ROM method. Let's now deep-dive and understand the upgrade procedure. Ignore the message which states that Suite Lifecycle is already upgraded; that will be removed in the next version. Click on CD-ROM and then check for upgrades. It now reads the manifest and reports that the upgrade is available. Give your consent and confirm that you have taken a snapshot. Validations should go through. What do we check: mandatory value check, root password check, disk space check on the / filesystem, requests in-progress check, VMware Aria Suite Lifecycle health check, VMware Identity Manager health check. Now click on Upgrade. The upgrade is triggered. It stays at 22% for a while as the RPMs are verified and staged into the repo.

Reference: /var/log/vmware/capengine/cap-non-lvm-update/workflow.log

Prechecks 2023/10/20 11:09:42.479602 workflow_manager.go:84: Fetching metadata for workflow 2023/10/20 11:09:42.480148 workflow_manager.go:354: Updating instance status to Running 2023/10/20 11:09:42.480601 task_progress.go:24: Starting task update-precheck * * 2023/10/20 11:09:43.171868 task_progress.go:24: Reading appliance metadata complete. 
2023/10/20 11:09:43.185705 workflow_manager.go:221: Task precheck_metadata completed Staging 2023/10/20 11:09:43.186514 task_progress.go:24: Starting task stage 2023/10/20 11:09:43.250662 stage_plugin.go:124: Using stage directory: /storage/eb595996-31c9-432c-933e-fa354438df65/stage 2023/10/20 11:09:43.267429 progress.go:11: Staging update * * 2023/10/20 11:10:43.268446 task_progress.go:24: Updates staged successfully 2023/10/20 11:10:43.288182 workflow_manager.go:221: Task stage completed Post staging , goes ahead with pre installation scripts 2023/10/20 11:10:43.288321 task_progress.go:24: Starting task ext-pre-install 2023/10/20 11:10:44.745400 task_progress.go:24: Finished executing pre install extension script 2023/10/20 11:10:44.757120 workflow_manager.go:221: Task ext-pre-install completed Validates Installation 2023/10/20 11:10:44.757212 task_progress.go:24: Starting task validate-install 2023/10/20 11:10:50.381346 workflow_manager.go:221: Task validate-install completed Moves to 55% when the rpm installation begins Starts Installing RPM's 2023/10/20 11:10:50.381492 task_progress.go:24: Starting task install 2023/10/20 11:10:50.647395 progress.go:11: Installing RPMs 2023/10/20 11:10:50.658173 installer.go:44: Installing RPMs 2023/10/20 11:10:50.658307 task_progress.go:24: Installing RPMs Remember, this is the stage where you can check install-<>.log as well. 
That's the lengthy log which installs all the RPMs, so I will not describe it in detail. ufdio: 1 reads, 17154 total bytes in 0.000013 secs D: ============== /storage/f5e77458-7b7f-49c8-a66f-d0bf3b0cca38/stage/apache-ant-1.10.12-2.ph3.noarch.rpm D: loading keyring from pubkeys in /var/lib/rpm/pubkeys/*.key D: couldn't find any keys in /var/lib/rpm/pubkeys/*.key D: loading keyring from rpmdb D: opening db environment /var/lib/rpm cdb:0x401 D: opening db index /var/lib/rpm/Packages 0x400 mode=0x0 * * * * D: closed db index /var/lib/rpm/Providename D: closed db index /var/lib/rpm/Requirename D: closed db index /var/lib/rpm/Group D: closed db index /var/lib/rpm/Basenames D: closed db index /var/lib/rpm/Name D: closed db environment /var/lib/rpm D: Exit status: 0

Once the installer log states Exit status: 0, it means that the upgrade part is now complete. The workflow logs state that it has finished installing the RPMs: 2023/10/20 11:15:27.232130 installer.go:32: Rebuilding RPM database 2023/10/20 11:15:27.875407 installer.go:57: Finished RPM installation 2023/10/20 11:15:27.875461 progress.go:11: Finished installing RPMs 2023/10/20 11:15:27.875762 task_progress.go:24: Finished installing RPMs 2023/10/20 11:15:27.946718 workflow_manager.go:221: Task install completed

After this the post-installation script starts. Reference: /var/log/vmware/capengine/cap-non-lvm-update/workflow.log 2023/10/20 13:10:40.708519 task_progress.go:24: Starting task ext-post-install 2023/10/20 13:10:41.324567 progress.go:11: Starting to execute post install extension script 2023/10/20 13:10:41.334470 command_exec.go:49: DEBUG running command: /bin/bash -c /var/tmp/f5e77458-7b7f-49c8-a66f-d0bf3b0cca38/post-install-script.sh 8.12.0.7 8.14.0.4 0 2023/10/20 13:10:41.334522 task_progress.go:24: Starting to execute post install extension script

When it starts executing the post-install script, we need to check 2 different logs; postupdate.log, shown below, to begin with. Reference: /var/log/bootstrap/postupdate.log
During the postupdate phase it checks: *** Begins Postupdate *** *** RPM Checks *** *** CAP User Creation *** *** Disable Password Expiration *** *** Update ulimit *** *** Set Python *** *** Postgres Configuration *** *** Cleanup Inprogress Requests *** *** Starts Services *** 2023-10-20 11:15:28 /etc/bootstrap/postupdate.d/25-start-services starting... + set -e + echo 'Reboot not required so starting all service.' Reboot not required so starting all service. + systemctl daemon-reload + systemctl restart vpostgres + touch /var/log/bootstrap/reboot-required + systemctl restart vrlcm-server + systemctl restart blackstone-spring + cp -f /var/lib/vrlcm/dev-build-upgrade.sh /usr/local/bin/vrlcm-cli + chmod 700 /usr/local/bin/vrlcm-cli + cp -r /var/lib/vrlcm/nginx.conf /etc/nginx/ + cp -r /var/lib/vrlcm/ssl.conf /etc/nginx/ + systemctl reload nginx + rm -f /var/lib/vrlcm/SUCCESS + rm -rf /tmp/dlfRepo + [[ ! -f /etc/triggerLicenseUpdate ]] + touch /etc/triggerLicenseUpdate + [[ -f /var/log/vrlcm/status.txt ]] + rm -rf /var/log/vrlcm/status.txt_backup + cp -r /var/log/vrlcm/status.txt /var/log/vrlcm/status.txt_backup + rm -rf /var/log/vrlcm/status.txt + /var/lib/vrlcm/populate.sh % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0

You would see that the postupdate progress waits for some time for the populate.sh script to run. This is the time you may check the following logs.

Reference: /data/script.log There are a bunch of updates which go through in the database checking all service status checking services are running checking dependent service status % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed ^M 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0^M 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 * * delete sucess api count: 176: delete failed api count: 0: Starting postgres update script for vRSLCM specific tables ALTER TABLE ALTER TABLE Disabling FIPS settings ALTER TABLE UPDATE 0

After the populate.sh script is complete, go ahead and check postupdate.log again: + [[ -f /var/lib/vrlcm/SUCCESS ]] + echo 'Creating INPROGRESS to block UI from loading...' Creating INPROGRESS to block UI from loading... + rm -rf /var/lib/vrlcm/SUCCESS + touch /var/lib/vrlcm/INPROGRESS + exit 0 2023-10-20 11:22:58 /etc/bootstrap/postupdate.d/25-start-services done, succeeded. *** Service Startups Conclude *** *** Cleaning-up PSPACK db's starts *** *** Fill CAP product info *** *** Updates upgrade status *** *** blackstone upgrade starts *** *** cleanup patch history *** *** Enable fips mode *** *** Create CAP update settings *** *** Disable VAMI *** *** Creates flag to reboot VA *** *** Postupdate task is now complete ***

Since the postupdate task is now complete, we shall now see the whole upgrade procedure completed by CAPENGINE and documented under workflow.log: 2023/10/20 11:15:28.274925 command_exec.go:49: DEBUG running command: /bin/bash -c /var/tmp/eb595996-31c9-432c-933e-fa354438df65/post-install-script.sh 8.12.0.7 8.14.0.4 0 2023/10/20 11:22:59.237676 non_lvm_update_post_update_script_plugin.go:84: Post install extension script output Updating vami-sfcb.service - Removing vmtoolsd service dependency already service is having restart policy Finished installing version 8.14.0.4 2023/10/20 11:22:59.237734 progress.go:11: Finished executing post install extension script 2023/10/20 11:22:59.238148 task_progress.go:24: Finished executing post install extension script 2023/10/20 11:22:59.249143 workflow_manager.go:221: Task ext-post-install completed 2023/10/20 11:22:59.249404 task_progress.go:24: Starting task metadata_update 2023/10/20 11:22:59.870305 progress.go:11: Updating appliance metadata 2023/10/20 11:22:59.872860 task_progress.go:24: Updating appliance metadata 2023/10/20 11:23:00.044506 progress.go:11: Metadata update completed 2023/10/20 11:23:00.044835 task_progress.go:24: Metadata update completed 2023/10/20 11:23:00.059053 workflow_manager.go:221: Task metadata_update completed 2023/10/20 11:23:00.059324 task_progress.go:24: Starting task cleanup 2023/10/20 11:23:00.289545 progress.go:11: Removing stage path 2023/10/20 11:23:00.306077 task_progress.go:24: Removing stage path 2023/10/20 11:23:00.306810 cleanup.go:64: Removing update directory: /storage/eb595996-31c9-432c-933e-fa354438df65 2023/10/20 11:23:00.912667 cleanup.go:72: Successfully removed update location 2023/10/20 11:23:00.983155 workflow_manager.go:221: Task cleanup completed 2023/10/20 11:23:00.983189 workflow_manager.go:183: All tasks finished for workflow 2023/10/20 11:23:00.983208 workflow_manager.go:354: Updating instance status to Completed

This concludes the upgrade of VMware Aria Suite Lifecycle to version 8.14.
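The log markers above lend themselves to a quick sanity check. This is a hedged sketch of my own (not a VMware utility) that counts finished CAP tasks in workflow.log and confirms the workflow reached "Completed":

```shell
#!/bin/sh
# Quick check against the workflow.log lines shown above.
# check_upgrade_workflow [path_to_workflow.log]
check_upgrade_workflow() {
    log="${1:-/var/log/vmware/capengine/cap-non-lvm-update/workflow.log}"
    # Each CAP task logs "Task <name> completed" when it finishes.
    echo "tasks completed: $(grep -c 'Task .* completed' "$log")"
    if grep -q 'Updating instance status to Completed' "$log"; then
        echo "upgrade workflow completed"
    else
        echo "workflow not complete yet"
        return 1
    fi
}
```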

  • Moving VMware Aria Operations from one VMware Aria Suite Lifecycle to another

    Usecase: An explanation of the steps taken if one wants to move a VMware Aria Operations instance that is integrated with vIDM (globalenvironment) from one Suite Lifecycle to another. Note: The VMware Aria Suite Lifecycle version on both source and destination should be the same, including the policy.

Environment: Here are the details of the environment we have on source and destination. The domain names used here are examples and do not represent any organization.

Product UI: The product has 2 auth sources, Local Users and vIDMAuthSource. You can see that when we select vIDMAuthSource, it redirects to vIDM and we can log in using configadmin, which is my local vIDM-based auth account. The auth source is configured to vIDM as shown below.

Procedure

Phase 1: Removal from the source VMware Aria Suite Lifecycle
Here's the Operations instance which I'd like to move to a different Suite Lifecycle. We can see that the Operations instance is integrated with VMware Identity Manager, and on globalenvironment (VMware Identity Manager) we can see this environment as a reference. In order to move this Operations instance to a different VMware Aria Suite Lifecycle instance, I'll have to remove it from Suite Lifecycle's inventory. Remember, I am deleting the environment because this is the only product in it. If I had multiple products in an environment, as shown in the next pane, I would only delete that specific product. Delete environment presents the following screen. I shall select the first option, which removes the environment/product from VMware Aria Suite Lifecycle and not from vCenter. If we select "Delete associated VMs from vCenter", it would delete all associated virtual machines/appliances from the vCenter, causing an outage. So let's select the first option to delete the environment from just VMware Aria Suite Lifecycle and submit the request. The environment is deleted from VMware Aria Suite Lifecycle and the request is now complete. The references in globalenvironment are removed too. This does not delete the integration the product has with vIDM; that shall still remain. Before moving to the destination Suite Lifecycle, I shall download the certificate being consumed by Operations in the source Suite Lifecycle and keep it aside to be imported into the destination Suite Lifecycle. We shall import this key into the destination Suite Lifecycle before we import the Operations product, so that when the product is imported, the certificate mapping is perfect.

Phase 2: Importing to the destination VMware Aria Suite Lifecycle
Let's import the certificate into Suite Lifecycle first. This is the Operations certificate we downloaded a few steps before. All I did was point to the downloaded PEM file, and it automatically detects the cert and its private key to import. Once we click on import, the certificate is imported. globalenvironment (VMware Identity Manager) on the destination is a distributed node. Let's import the product into the destination Suite Lifecycle. Click on "Create Environment" to start the import process. Select VMware Aria Operations as the product to import and click next. Because I am importing an existing product, there is not much information I need to enter in the next pane. All I need to enter is the master node's FQDN, select the passwords, and choose the vCenter it's located on. Click next to review the summary and submit the request. The import request completes, and we can see the new environment in the destination Suite Lifecycle. Remember the certificate we imported just before the product import: it is now marked as used, as it is mapped to the imported product. If you observe closely, the imported Operations instance is still pointing to the source VMware Identity Manager, while the vIDM integration is set to false because it is not integrated with the vIDM in this Suite Lifecycle. In the next phase we will add the new vIDM as an auth source and remove the existing vIDM auth source. This should be done from the VMware Aria Operations UI.

Phase 3: Replacing the auth source in VMware Aria Operations
Before making any changes, take a snapshot. Log in to VMware Aria Operations as admin and browse to Administration and Authentication Sources. As you can see, it's currently pointing to the source vIDM. Make sure the roles, and the groups to which the roles are assigned, are noted down so that the same groups can be re-added again. Delete the authentication source. Go back to Suite Lifecycle and perform an inventory sync; as one can see, the vIDM information is now gone. Now let's add a new auth source in Operations as shown below. This will point to the new globalenvironment on the destination Suite Lifecycle. Enter the appropriate information and click on test. Accept the certificate once the test connection is successful, and click OK to save the config. Now the auth source points to the new vIDM. When I log out and check vIDMAuthSource now, it points to the new vIDM. I'd now map the group back in Operations and give the same role or access to the user.

Phase 4: Inventory sync to reflect the appropriate properties in Suite Lifecycle
Get back to Suite Lifecycle and perform an inventory sync. After the sync, the properties are updated and the reference is shown as well. This concludes the blog.
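Since the certificate exported from the source Locker must carry both the certificate and its private key for the import to detect them, a hedged sketch of a pre-import sanity check with openssl follows. The filename is hypothetical, and this is my own addition, not part of the Suite Lifecycle procedure.

```shell
#!/bin/sh
# Hedged sketch: sanity-check an exported Locker PEM before importing it on
# the destination - show the certificate subject/expiry and confirm a private
# key travelled with it.
inspect_locker_pem() {
    pem="$1"
    openssl x509 -in "$pem" -noout -subject -enddate
    grep -q 'PRIVATE KEY' "$pem" && echo "private key present"
}

# Example (hypothetical filename):
#   inspect_locker_pem operations-cert.pem
```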

  • Deploying Clustered VMware Aria Operations for Logs through Suite Lifecycle

    Goal: Deploy a clustered VMware Aria Operations for Logs (formerly vRealize Log Insight).
Environment
Logs to monitor
Deployment Process
1. Create an environment in VMware Aria Suite Lifecycle which will host the VMware Aria Operations for Logs product.
2. Accept the EULA.
3. Select an appropriate certificate.
4. Provide infrastructure details.
5. Enter details for the product to be deployed. We are integrating it with vIDM and also supplying a load balancer VIP for a distributed Operations for Logs configuration.
6. Now run a precheck. Remember, all prechecks should be successful.
7. Review the summary and click on submit to move forward with the product deployment.

There are 9 stages for a 3-node Operations for Logs deployment. Total deployment time taken is around 52 minutes and 2 seconds.

Behind the scenes: the VMware Aria Suite Lifecycle perspective
Reference: /var/log/vrlcm/vmware_vrlcm.log
Checking the certificate replacement task
/** Certificate Init Task Starts here **/ 2023-07-24 02:35:55.019 INFO [pool-3-thread-6] c.v.v.l.p.v.VrliCertificateInitTask - -- Starting :: VMware Aria Operations for Logs Install Certificate Init Task.... 
2023-07-24 02:35:55.019 INFO [pool-3-thread-6] c.v.v.l.p.v.VrliCertificateInitTask - -- certificate : locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs 2023-07-24 02:35:55.019 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVrliCertificateInitTaskCompleted /** In the product spec, we have the locker object information for certificate, vidm information and opslogs information **/ 2023-07-24 02:35:55.020 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- ======================================== { "productSpec" : { "object" : null, "originalObject" : null }, "componentSpec" : { "object" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, "componentVersion" : null, "properties" : { "contentLibraryId" : null, "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "certificateId" : "locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "baseTenantId" : "base tenant", "certificate" : "locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "workerNodeIP" : "XX.XX.XX.18,XX.XX.XX.19", "vrliAdminEmail" : "conifigadmin@vsphere.local", "__version" : "8.12.0", "nodeSize" : "small", "masterVidmAdminUserName" : "admin", "environmentName" : "Operations for Logs", * * * * "vrliClusterVips" : "XX.XX.XX.16#lilb.cap.org", "isUpgradeVmCompatibility" : "true", "authProviderHostnames" : "vidmlb.cap.org,vidmtwo.cap.org,vidmone.cap.org,vidmthree.cap.org", "vidmPrimaryNodeHostname" : "vidmone.cap.org", "appIcon" : "vidm_driver_images/vrli.png", "appName" : "Operations for Logs_vRLI_8.12.0_XX.XX.XX.17", "targetUrl" : "https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login" } }, "priority" : 4 }, "originalObject" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, "componentVersion" : null, "properties" : { "contentLibraryId" : null, "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "certificateId" : 
"locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "baseTenantId" : "base tenant", "certificate" : "locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "workerNodeIP" : "XX.XX.XX.18,XX.XX.XX.19", "vrliAdminEmail" : "conifigadmin@vsphere.local", "__version" : "8.12.0", "nodeSize" : "small", "masterVidmAdminUserName" : "admin", "environmentName" : "Operations for Logs", "vrliHostName" : "lione.cap.org", "masterNodeIP" : "XX.XX.XX.17", "__productId" : "vrli", * * * * "authProviderHostnames" : "vidmlb.cap.org,vidmtwo.cap.org,vidmone.cap.org,vidmthree.cap.org", "vidmPrimaryNodeHostname" : "vidmone.cap.org", "appIcon" : "vidm_driver_images/vrli.png", "appName" : "Operations for Logs_vRLI_8.12.0_XX.XX.XX.17", "targetUrl" : "https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login" } }, "priority" : 4 } } } /** Complete Product Spec **/ 2023-07-24 02:35:55.021 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- FIELD NAME :: productSpec 2023-07-24 02:35:55.021 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- KEY PICKER IS NULL 2023-07-24 02:35:55.022 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- ======================================== { "productSpec" : { "object" : null, "originalObject" : null }, "componentSpec" : { "object" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, "componentVersion" : null, "properties" : { "contentLibraryId" : null, "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "certificateId" : "locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "baseTenantId" : "base tenant", "certificate" : "locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs", "workerNodeIP" : "XX.XX.XX.18,XX.XX.XX.19", "vrliAdminEmail" : "conifigadmin@vsphere.local", "__version" : "8.12.0", "nodeSize" : "small", "masterVidmAdminUserName" : "admin", "environmentName" : "Operations for Logs", "vrliHostName" : 
"lione.cap.org", "masterNodeIP" : "XX.XX.XX.17", "__productId" : "vrli", "ntp" : "****.****", "uberAdminUserType" : "LOCAL", "masterVidmAdminPassword" : "JXJXJXJX", "vrliAdminPassword" : "JXJXJXJX", "uberAdmin" : "configadmin", "masterVidmEnabled" : "true", "uberAdminPassword" : "JXJXJXJX", "masterVidmHostName" : "vidmlb.cap.org", "fipsMode" : "False", "timeSyncMode" : "ntp", "isVcfUser" : "false", "isVcfEnabledEnv" : "false", "vrliClusterVips" : "XX.XX.XX.16#lilb.cap.org", "isUpgradeVmCompatibility" : "true", "authProviderHostnames" : "vidmlb.cap.org,vidmtwo.cap.org,vidmone.cap.org,vidmthree.cap.org", "vidmPrimaryNodeHostname" : "vidmone.cap.org", "appIcon" : "vidm_driver_images/vrli.png", "appName" : "Operations for Logs_vRLI_8.12.0_XX.XX.XX.17", "targetUrl" : "https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login" } }, "priority" : 4 }, "originalObject" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, "componentVersion" : null, "properties" : { "contentLibraryId" : null, * * * * "priority" : 4 } } } /** Component Spec **/ 2023-07-24 02:35:55.023 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- FIELD NAME :: componentSpec 2023-07-24 02:35:55.023 INFO [pool-3-thread-6] c.v.v.l.p.a.s.Task - -- KEY PICKER IS :: com.vmware.vrealize.lcm.drivers.commonplugin.task.keypicker.GenericComponentSpecKeyPicker 2023-07-24 02:35:55.537 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "91cec54d-57df-4dc5-9f3b-b9a1ea11dda5", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1690166155030, "lastUpdatedOn" : 1690166155533, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnVrliCertificateInitTaskCompleted", "currentState" : null, "eventArgument" : 
"{\"certificate\":{\"name\":\"certificate\",\"type\":\"java.lang.String\",\"value\":\"\\\"locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs\\\"\"},\"productSpec\": * * * * "errorCause" : null, "sequence" : 3454, "eventLock" : 1, "engineNodeId" : "devvasl.cap.org" } /** Certificate Init Task Completes **/ 2023-07-24 02:35:55.546 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vrlicluster 2023-07-24 02:35:55.546 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vrlicluster 2023-07-24 02:35:55.546 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2023-07-24 02:35:55.599 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vrlicluster.vmfx 2023-07-24 02:35:55.602 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2023-07-24 02:35:55.603 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2023-07-24 02:35:55.603 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vrlicluster.vmfx 2023-07-24 02:35:55.603 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vrlicluster.vmfx 2023-07-24 02:35:55.603 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vrlicluster.vmfx 2023-07-24 02:35:55.605 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vrlicluster', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vrlicluster.vmfx'} 2023-07-24 02:35:55.606 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVrliCertificateInitTaskCompleted 2023-07-24 02:35:55.606 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: 
com.vmware.vrealize.lcm.plugin.vrli.VrliCertificateInitTask 2023-07-24 02:35:55.606 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.vrli.VrliImportCertificateTask 2023-07-24 02:35:55.611 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.vrli.VrliImportCertificateTask 2023-07-24 02:35:55.613 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2023-07-24 02:35:55.614 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2023-07-24 02:35:55.614 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2023-07-24 02:35:55.620 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2023-07-24 02:35:55.622 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2023-07-24 02:35:55.625 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vrliAdminPassword<=KXKXKXKX KEY :: XXXXXX 2023-07-24 02:35:55.628 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: certificate 2023-07-24 02:35:55.628 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- LOCKER OBJECT: TYPE :: certificate <===> TARGET :: certificateDto 2023-07-24 02:35:55.630 INFO [scheduling-1] c.v.v.l.l.s.p.PKIStoreServiceImpl - -- Reading certificate 'f33c54a7-efd1-4609-9b4b-8d1010ce368c' from [PKIStoreService] 2023-07-24 02:35:55.650 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2023-07-24 02:35:55.651 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
/** Certificate Import Task starts **/ 2023-07-24 02:35:55.659 INFO [pool-3-thread-8] c.v.v.l.p.v.VrliImportCertificateTask - -- Starting :: VMware Aria Operations for Logs import certificate task 2023-07-24 02:35:55.660 INFO [pool-3-thread-8] c.v.v.l.p.v.VrliImportCertificateTask - -- Called from Install flow 2023-07-24 02:35:56.019 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- Return message for VMware Aria Operations for Logs: {"releaseName":"GA","version":"8.12.0-21696970"} 2023-07-24 02:35:56.020 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- Return status code for VMware Aria Operations for Logs: 200 2023-07-24 02:35:56.026 INFO [pool-3-thread-8] c.v.v.l.p.v.VrliImportCertificateTask - -- Version >= 4.6.0. Applying certificate for the newly deployed VMware Aria Operations for Logs 2023-07-24 02:35:56.027 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- Checking if VMware Aria Operations for Logs instance is running 2023-07-24 02:35:56.105 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- The VMware Aria Operations for Logs instance https://XX.XX.XX.17 service is running 2023-07-24 02:36:03.690 INFO [http-nio-8080-exec-9] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-24 02:37:03.710 INFO [http-nio-8080-exec-5] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null 
org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-24 02:37:39.993 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- certificate api response: statuscode = 201 2023-07-24 02:37:39.997 INFO [pool-3-thread-8] c.v.v.l.d.v.InstallConfigureVRLI - -- certificate api response: message = Created /** Certificate is uploaded and completion is called **/ 2023-07-24 02:37:39.997 INFO [pool-3-thread-8] c.v.v.l.p.v.VrliImportCertificateTask - -- Certificate uploaded successfully 2023-07-24 02:37:39.997 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVRLIUpdateCertificateCompletion 2023-07-24 02:37:39.998 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- ======================================== { "productSpec" : { "object" : null, "originalObject" : null }, "componentSpec" : { "object" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, * * * * Logs_vRLI_8.12.0_XX.XX.XX.17", "targetUrl" : "https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login" } }, "priority" : 4 } } } 2023-07-24 02:37:40.000 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- FIELD NAME :: productSpec 2023-07-24 02:37:40.000 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- KEY PICKER IS NULL 2023-07-24 02:37:40.000 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- ======================================== { "productSpec" : { "object" : null, "originalObject" : null }, "componentSpec" : { 
"object" : { "component" : { "symbolicName" : "vrlicluster", "type" : null, "componentVersion" : null, * * * * "targetUrl" : "https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login" } }, "priority" : 4 } } } 2023-07-24 02:37:40.002 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- FIELD NAME :: componentSpec 2023-07-24 02:37:40.002 INFO [pool-3-thread-8] c.v.v.l.p.a.s.Task - -- KEY PICKER IS :: com.vmware.vrealize.lcm.drivers.commonplugin.task.keypicker.GenericComponentSpecKeyPicker 2023-07-24 02:37:40.534 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "e91c6812-e7a5-40b9-aa3d-4b4983369a96", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1690166260003, "lastUpdatedOn" : 1690166260530, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnVRLIUpdateCertificateCompletion", "currentState" : null, "eventArgument" : "{\"productSpec\":{\"name\":\"productSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ProductSpecification\",\"value\":\"null\"},\"componentSpec\":{\"name\":\"componentSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ComponentDeploymentSpecification\",\"value\":\"{\\\"component\\\":{\\\"symbolicName\\\":\\\"vrlicluster\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"contentLibraryId\\\":null,\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"certificateId\\\":\\\"locker:KXKXKXKX\\\",\\\"baseTenantId\\\":\\\"base tenant\\\",\\\"certificate\\\":\\\"locker:certificate:f33c54a7-efd1-4609-9b4b-8d1010ce368c:opslogs\\\",\\\"workerNodeIP\\\":\\\"XX.XX.XX.18,XX.XX.XX.19\\\",\\\"vrliAdminEmail\\\":\\\"conifigadmin@vsphere.local\\\",\\\"__version\\\":\\\"8.12.0\\\",\\\"nodeSize\\\":\\\"small\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"environmentName\\\":\\\"Operations for 
Logs\\\",\\\"vrliHostName\\\":\\\"lione.cap.org\\\",\\\"masterNodeIP\\\":\\\"XX.XX.XX.17\\\",\\\"__productId\\\":\\\"vrli\\\",\\\"ntp\\\":\\\"****.****\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"vrliAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"masterVidmHostName\\\":\\\"vidmlb.cap.org\\\",\\\"fipsMode\\\":\\\"False\\\",\\\"timeSyncMode\\\":\\\"ntp\\\",\\\"isVcfUser\\\":\\\"false\\\",\\\"isVcfEnabledEnv\\\":\\\"false\\\",\\\"vrliClusterVips\\\":\\\"XX.XX.XX.16#lilb.cap.org\\\",\\\"isUpgradeVmCompatibility\\\":\\\"true\\\",\\\"authProviderHostnames\\\":\\\"vidmlb.cap.org,vidmtwo.cap.org,vidmone.cap.org,vidmthree.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"vidmone.cap.org\\\",\\\"appIcon\\\":\\\"vidm_driver_images/vrli.png\\\",\\\"appName\\\":\\\"Operations for Logs_vRLI_8.12.0_XX.XX.XX.17\\\",\\\"targetUrl\\\":\\\"https://vidmlb.cap.org:443/SAAS/auth/oauth2/authorize?response_type=code&client_id=116b0383-9fc3-43ee-aff6-a1e753584f35&redirect_uri=https://lilb.cap.org/login\\\"}},\\\"priority\\\":4}\"}}", "status" : "CREATED", "stateMachineInstance" : "1adef09c-09d0-4a16-bd6d-036db4eb2ce7", "errorCause" : null, "sequence" : 3456, "eventLock" : 1, "engineNodeId" : "devvasl.cap.org" } 2023-07-24 02:37:40.542 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vrlicluster 2023-07-24 02:37:40.542 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vrlicluster 2023-07-24 02:37:40.542 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2023-07-24 02:37:40.568 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vrlicluster.vmfx 2023-07-24 02:37:40.568 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE 
ContentDownloadControllerImpl 2023-07-24 02:37:40.569 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2023-07-24 02:37:40.569 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vrlicluster.vmfx 2023-07-24 02:37:40.569 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vrlicluster.vmfx 2023-07-24 02:37:40.569 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vrlicluster.vmfx 2023-07-24 02:37:40.572 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vrlicluster', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vrlicluster.vmfx'} 2023-07-24 02:37:40.573 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVRLIUpdateCertificateCompletion 2023-07-24 02:37:40.573 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.vrli.VrliImportCertificateTask 2023-07-24 02:37:40.574 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask 2023-07-24 02:37:40.581 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask Operations for Log perspective api_audit.log [2023-07-24 02:35:56.476+0000] ["application-akka.actor.default-dispatcher-21"/1X.XX.XX.XX INFO] [com.vmware.loginsight.api.Authentication.AuthenticatedActionBase] [admin calls POST /api/v1/certificate] [2023-07-24 02:35:56.485+0000] ["application-akka.actor.default-dispatcher-21"/1X.XX.XX.XX INFO] [com.vmware.loginsight.api.Authentication.AuthenticatedActionBase] [admin calls POST /api/v1/certificate] runtime.log [2023-07-24 02:35:35.445+0000] ["SslCertificateManagerScheduler-thread-1"/1X.XX.XX.XX INFO] 
[com.vmware.loginsight.daemon.shared.ssl.SslCertificateManager] [checkAndUpdateTruststore--] [2023-07-24 02:35:35.448+0000] ["SslCertificateManagerScheduler-thread-1"/1X.XX.XX.XX INFO] [com.vmware.loginsight.commons.security.UrlConnectionManager] [Loading truststore from path: /usr/java/jre-vmware/lib/security/cacerts] [2023-07-24 02:35:56.486+0000] ["application-akka.actor.default-dispatcher-21"/1X.XX.XX.XXINFO] [com.vmware.loginsight.api.providers.certificate.CertificateProvider] [Saving new custom SSL certificate] [2023-07-24 02:36:35.450+0000] ["SslCertificateManagerScheduler-thread-1"/1X.XX.XX.XX INFO] [com.vmware.loginsight.commons.security.UrlConnectionManager] [Loading truststore from path: /usr/java/jre-vmware/lib/security/cacerts] [2023-07-24 02:37:32.717+0000] ["DaemonCommands-thread-14"/1X.XX.XX.XX INFO] [com.vmware.loginsight.daemon.shared.ssl.SslCertificateManager] [SSL script result: [exitCode=0, stdOut=, stdErr=Importing keystore /usr/lib/loginsight/application/etc/certs/keystore.pkcs12 to /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore... Warning: The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore -destkeystore /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore -deststoretype pkcs12". Certificate was added to keystore Importing keystore /usr/lib/loginsight/application/etc/certs/keystore.pkcs12 to /usr/lib/loginsight/application/etc/3rd_config/keystore... Warning: The JKS keystore uses a proprietary format. 
It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore /usr/lib/loginsight/application/etc/3rd_config/keystore -destkeystore /usr/lib/loginsight/application/etc/3rd_config/keystore -deststoretype pkcs12".]] [2023-07-24 02:37:32.717+0000] ["DaemonCommands-thread-14"/1X.XX.XX.XX WARN] [com.vmware.loginsight.daemon.shared.ssl.SslCertificateManager] [Importing keystore /usr/lib/loginsight/application/etc/certs/keystore.pkcs12 to /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore... Warning: The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore -destkeystore /usr/lib/loginsight/application/3rd_party/apache-tomcat-8.5.87/conf/keystore -deststoretype pkcs12". Certificate was added to keystore Importing keystore /usr/lib/loginsight/application/etc/certs/keystore.pkcs12 to /usr/lib/loginsight/application/etc/3rd_config/keystore... Warning: The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore /usr/lib/loginsight/application/etc/3rd_config/keystore -destkeystore /usr/lib/loginsight/application/etc/3rd_config/keystore -deststoretype pkcs12".] 
[2023-07-24 02:37:32.717+0000] ["DaemonCommands-thread-14"/1X.XX.XX.XX INFO] [com.vmware.loginsight.daemon.shared.ssl.SslCertificateManager] [Syncing certificate] [2023-07-24 02:37:35.445+0000] ["SslCertificateManagerScheduler-thread-1"/1X.XX.XX.XX INFO] [com.vmware.loginsight.daemon.shared.ssl.SslCertificateManager] [checkAndUpdateTruststore--] [2023-07-24 02:37:35.450+0000] ["SslCertificateManagerScheduler-thread-1"/1X.XX.XX.XX INFO] [com.vmware.loginsight.commons.security.UrlConnectionManager] [Loading truststore from path: /usr/java/jre-vmware/lib/security/cacerts]
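The keystore warnings in runtime.log above recommend migrating the proprietary JKS keystore to PKCS12 with a specific `keytool -importkeystore` invocation. A small sketch of building that exact command (paths and flags are the ones printed in the log warning; run this manually on an Operations for Logs node only after taking a snapshot, and note the product manages these keystores itself):

```python
# Sketch: build the keytool invocation recommended by the runtime.log warning
# for an in-place JKS -> PKCS12 keystore migration. Flags mirror the log text.
def jks_to_pkcs12_cmd(keystore_path):
    """Return the keytool argv that migrates keystore_path to PKCS12 in place."""
    return [
        "keytool", "-importkeystore",
        "-srckeystore", keystore_path,
        "-destkeystore", keystore_path,
        "-deststoretype", "pkcs12",
    ]


if __name__ == "__main__":
    # Path taken from the log excerpt above.
    cmd = jks_to_pkcs12_cmd(
        "/usr/lib/loginsight/application/etc/3rd_config/keystore")
    print(" ".join(cmd))
```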

  • VMware Identity Manager Upgrade from 3.3.6 to 3.3.7 | Deepdive |

Goal: Upgrade globalenvironment (VMware Identity Manager) from version 3.3.6 to 3.3.7. For this demo, we are using a vRSLCM 8.10 PSPACK 15 environment.

Pre-requisites
Take a snapshot of the VMware Identity Manager globalenvironment product. Take the snapshot using the "Create Snapshot" day-2 action of vRSLCM; it takes around 58 minutes to complete the snapshot task. Then perform an inventory sync.

Upgrade: before request submission phase
In order to perform an upgrade, we need the upgrade repo downloaded. Once it is downloaded, trigger the upgrade of vIDM from the Lifecycle Operations pane of Suite Lifecycle. An inventory sync must be performed here as well in order to proceed. Once the inventory sync is complete, click on "Proceed" to start the process. Remember, we already have the upgrade repo downloaded, so point it to the same link. You may opt for a product snapshot and also choose to retain the snapshot after the upgrade. Clicking Next will ask you to run a precheck. The following are verified during the precheck:
SSH connectivity
Disk space on /
Node health
Opened ports
Root password expiration
Version check
Sizing check
Prechecks are successful as shown above.

Upgrade triggered phase
Review the summary before submitting the request, then click on Submit to initiate the upgrade. The upgrade is now triggered. The multiple stages during the upgrade are shown below. Remember, the timings mentioned below are from my lab; these might vary in your environment. 
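The precheck list above can be pictured as a simple pass/fail runner. This is purely an illustration of the gating logic, not Suite Lifecycle's actual implementation; the check names come from the precheck list, while the lambdas, the `env` keys, and the 2 GB disk threshold are invented for the sketch.

```python
# Sketch: the vIDM upgrade prechecks as a gating runner. Check names match the
# blog's precheck list; the predicates and thresholds here are illustrative.
PRECHECKS = {
    "SSH connectivity": lambda env: env.get("ssh_ok", False),
    "Disk space on /": lambda env: env.get("root_free_kb", 0) >= 2 * 1024 * 1024,
    "Node health": lambda env: env.get("nodes_healthy", False),
    "Opened ports": lambda env: env.get("ports_open", False),
    "Root password expiration": lambda env: env.get("root_pw_days_left", 0) > 0,
    "Version check": lambda env: env.get("version") == "3.3.6",
    "Sizing check": lambda env: env.get("sizing_ok", False),
}


def run_prechecks(env):
    """Return {check_name: passed}; the upgrade proceeds only if all pass."""
    return {name: bool(check(env)) for name, check in PRECHECKS.items()}


def can_upgrade(env):
    return all(run_prechecks(env).values())
```

A node description that passes every check (`ssh_ok=True`, ample free space, healthy nodes, open ports, a non-expired root password, version 3.3.6, and valid sizing) yields `can_upgrade(...) == True`; any single failure blocks the request.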
You might see some steps repeating as it has to perform upgrade on 3 VMware Identity Manager nodes one after another vRSLCM logs deep-dive w.r.t vIDM Upgrade Reference: /var/log/vrlcm/vmware_vrlcm.log We will specifically look into Stages 13 to 21 where the upgrade and post upgrade steps occur //**** Upgrade vIDM Offline state machine ( stage 11 ) **** // To begin with old updateoffline.hzn is deleted and new one is copied and permissions are changed 2023-07-22 12:05:26.565 INFO [pool-3-thread-23] c.v.v.l.v.c.t.VidmUploadUpgradeScriptTask - -- Starting :: vIDM upload upgrade script Task 2023-07-22 12:05:26.567 INFO [pool-3-thread-23] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property vidmUpgradeBasePath value is obtained from Config Service : /usr/local/horizon/update/ 2023-07-22 12:05:26.567 INFO [pool-3-thread-23] c.v.v.l.v.c.t.VidmUploadUpgradeScriptTask - -- Target version is 3.3.3 and above, loading new offline upgrade script. 2023-07-22 12:05:26.568 INFO [pool-3-thread-23] c.v.v.l.v.c.t.VidmUploadUpgradeScriptTask - -- Upgrade script source folder path :: newupgradescript/ 2023-07-22 12:05:26.667 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:05:26.668 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:26.668 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command: ls /usr/local/horizon/update/updateoffline.hzn 2023-07-22 12:05:26.668 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:27.260 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:05:27.260 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:05:27.261 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : 
"/usr/local/horizon/update/updateoffline.hzn\n", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:05:27.268 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Found file for upload: /tmp/updateoffline.hzn 2023-07-22 12:05:27.393 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Uploading file --> ssh://root@vidmone.cap.org/usr/local/horizon/update/ 2023-07-22 12:05:27.740 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Uploaded file sucessfully 2023-07-22 12:05:27.742 INFO [pool-3-thread-23] c.v.v.l.v.c.t.VidmUploadUpgradeScriptTask - -- Uploaded file to the location : /tmp/updateoffline.hzn 2023-07-22 12:05:27.935 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:05:27.937 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:27.938 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command: chmod 500 /usr/local/horizon/update/updateoffline.hzn 2023-07-22 12:05:27.939 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:28.520 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:05:28.520 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:05:28.521 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:05:28.522 INFO [pool-3-thread-23] c.v.v.l.v.c.t.VidmUploadUpgradeScriptTask - -- File: /tmp/updateoffline.hzn deleted successfully. 
2023-07-22 12:05:28.612 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:05:28.613 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:28.613 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command: sed -i '/^ *read$/d' /usr/local/horizon/update/updateoffline.hzn 2023-07-22 12:05:28.613 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:05:29.119 INFO [http-nio-8080-exec-2] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-22 12:05:29.212 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:05:29.212 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:05:29.213 INFO [pool-3-thread-23] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:05:29.213 INFO [pool-3-thread-23] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmUploadUpgradeScriptCompletion vIDM bundle upload task is initiated and completed 2023-07-22 12:05:29.563 INFO [pool-3-thread-26] c.v.v.l.v.c.t.VidmUploadUpgradeBundleTask - -- Starting :: vIDM upgrade bundle upload Task 2023-07-22 12:05:29.563 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- 
VidmInstallTaskUtil.getLeasedLocalFileLocation :: Start 2023-07-22 12:05:29.565 INFO [pool-3-thread-26] c.v.v.l.c.s.ContentLeaseServiceImpl - -- Inside create content lease. 2023-07-22 12:05:29.566 INFO [pool-3-thread-26] c.v.v.l.c.s.ContentLeaseServiceImpl - -- Created lease for the folder with id :: 8a233ccd-ccf9-4ddb-adff-358c20c20fe1. 2023-07-22 12:05:29.577 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- /data/vm-config/symlinkdir/8a233ccd-ccf9-4ddb-adff-358c20c20fe1 2023-07-22 12:05:29.577 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- Started Downloading Content Repo 2023-07-22 12:05:29.577 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2023-07-22 12:05:29.578 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /productBinariesRepo 2023-07-22 12:05:29.578 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /vidm/3.3.7/upgrade/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:05:29.578 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- URL :: /productBinariesRepo/vidm/3.3.7/upgrade/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:05:29.578 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /productBinariesRepo/vidm/3.3.7/upgrade/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:05:29.581 INFO [pool-3-thread-26] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='da06426f-5f18-4985-b640-38d711fd597f', version=8.1.0.0} -> repoName='productBinariesRepo', contentState='PUBLISHED', url='/productBinariesRepo/vidm/3.3.7/upgrade/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz'} 2023-07-22 12:05:29.582 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- Completed Downloading from Content Repo and starting InputStream 2023-07-22 12:05:29.582 INFO [pool-3-thread-26] 
c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- /data/vm-config/symlinkdir/8a233ccd-ccf9-4ddb-adff-358c20c20fe1/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:09:56.628 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- Output stream completed * * * * 2023-07-22 12:09:56.637 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- VidmInstallTaskUtil.getLeasedLocalFileLocation upgradeFileLocation : /data/vm-config/symlinkdir/8a233ccd-ccf9-4ddb-adff-358c20c20fe1/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:09:56.649 INFO [pool-3-thread-26] c.v.v.l.v.c.t.VidmUploadUpgradeBundleTask - -- Started :: Copying upgrade bundle to vIDM with hostname: vidmone.cap.org 2023-07-22 12:09:56.652 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property vidmMinDiskspaceFor333UpgradeInKb value is obtained from Config Service : 7340032 2023-07-22 12:09:56.653 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property vidmOptMinDiskspaceFor333UpgradeInKb value is obtained from Config Service : 7 2023-07-22 12:09:56.654 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property vidmExpandDiskspaceFor333UpgradeKb value is obtained from Config Service : https://kb.vmware.com/s/article/81220 2023-07-22 12:09:56.655 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- Checking if required free space of 7340032 GB is available in /var or / partiton in vIDM host vidmone.cap.org 2023-07-22 12:09:56.750 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:09:56.751 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:09:56.751 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command: df | grep -w "/" | awk '{print $4}' 2023-07-22 12:09:56.751 INFO [pool-3-thread-26] 
c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:09:57.340 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:09:57.340 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:09:57.343 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "8693676\n", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:09:57.343 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- Partition / has disk space of 8 GB 2023-07-22 12:09:57.343 INFO [pool-3-thread-26] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- Partition / has required disk space of 7340032 GB to store vIDM upgrade bundle 2023-07-22 12:09:57.345 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- vIDM ENDPOINT HOST :: vidmone.cap.org 2023-07-22 12:09:57.345 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- COMMAND :: mkdir -p /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/ 2023-07-22 12:09:57.444 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:09:57.445 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:09:57.445 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command: mkdir -p /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/ 2023-07-22 12:09:57.446 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:09:58.528 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:09:58.528 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:09:58.529 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Command Status code :: 0 2023-07-22 12:09:58.529 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:09:58.530 INFO 
[pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Output Stream :: 2023-07-22 12:09:58.530 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:09:58.530 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- 2023-07-22 12:09:58.530 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:09:58.530 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Error Stream :: 2023-07-22 12:09:58.531 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:09:58.532 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- null 2023-07-22 12:09:58.532 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:09:58.534 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Found file for upload: /data/vm-config/symlinkdir/8a233ccd-ccf9-4ddb-adff-358c20c20fe1/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:09:58.735 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Uploading file --> ssh://root@vidmone.cap.org/var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/ * * * 2023-07-22 12:15:05.647 INFO [pool-3-thread-26] c.v.v.l.v.c.t.VidmUploadUpgradeBundleTask - -- Completed :: Copying upgrade bundle to vIDM host: vidmone.cap.org 2023-07-22 12:15:05.647 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- vIDM ENDPOINT HOST :: vidmone.cap.org 2023-07-22 12:15:05.647 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- COMMAND :: chmod 644 /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz && chown root:KXKXKXKX /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:15:06.264 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:15:06.266 INFO 
[pool-3-thread-26] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:15:06.266 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command: chmod 644 /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz && chown root:KXKXKXKX /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz 2023-07-22 12:15:06.266 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Command Status code :: 0 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Output Stream :: 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- 2023-07-22 12:15:07.356 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:15:07.357 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- Error Stream :: 2023-07-22 12:15:07.357 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:15:07.357 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- null 2023-07-22 12:15:07.357 INFO [pool-3-thread-26] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:15:07.357 INFO [pool-3-thread-26] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmUpgradeBundleUploadCompletion Starts vIDM offline upgrade task 2023-07-22 
12:15:07.961 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Starting :: vIDM offline upgrade Task 2023-07-22 12:15:07.962 INFO [pool-3-thread-27] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property vidmUpgradeBasePath value is obtained from Config Service : /usr/local/horizon/update/ 2023-07-22 12:15:07.963 INFO [pool-3-thread-27] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property timeoutCount value is obtained from Config Service : 45 2023-07-22 12:15:07.963 INFO [pool-3-thread-27] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property serverAliveCountMax value is obtained from Config Service : 3 2023-07-22 12:15:07.964 INFO [pool-3-thread-27] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property serverAliveInterval value is obtained from Config Service : 60000 2023-07-22 12:15:07.964 INFO [pool-3-thread-27] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property timeoutSleep value is obtained from Config Service : 60000 2023-07-22 12:15:07.965 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Version to upgrade: 3.3.7 source version :: 3.3.6.0 2023-07-22 12:15:07.965 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Offline Upgrade to version 3.3.3 and above. 
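A few lines earlier, the log shows the disk-space pre-check: vRSLCM reads the free kilobytes on the / partition over SSH and compares them with vidmMinDiskspaceFor333UpgradeInKb (7340032 KB, roughly 7 GB; the "GB" label in the log message is misleading, the unit is KB). A minimal sketch of that comparison, where check_space is a hypothetical helper and 8693676 is the value the log actually returned:

```shell
# check_space is a hypothetical helper (not vRSLCM code):
# usage: check_space FREE_KB REQUIRED_KB -> prints OK or WARNING
check_space() {
    if [ "$1" -ge "$2" ]; then
        echo "OK: $1 KB free on /, the upgrade bundle will fit"
    else
        echo "WARNING: only $1 KB free on /, need $2 KB"
    fi
}

# vRSLCM reads the free KB with this exact pipeline over SSH:
#   df | grep -w "/" | awk '{print $4}'
# In the log it returned 8693676 (about 8 GB free):
check_space 8693676 7340032
```

Running the same pipeline on a vIDM node ahead of the upgrade is an easy way to predict whether this pre-check will pass.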
2023-07-22 12:15:08.090 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:15:08.097 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:15:08.098 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: touch /usr/local/horizon/conf/flags/elasticsearchReindexed.flag 2023-07-22 12:15:08.099 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:15:39.128 INFO [http-nio-8080-exec-5] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-22 12:16:08.184 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:16:08.191 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:16:08.193 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- output content : 2023-07-22 12:16:08.321 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:16:08.324 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:16:08.324 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: rm -rf /db/elasticsearch/horizon 2023-07-22 12:16:08.325 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 
2023-07-22 12:16:40.145 INFO [http-nio-8080-exec-3] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-22 12:17:08.420 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:17:08.427 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:17:08.428 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- output content : 2023-07-22 12:17:08.526 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:17:08.527 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:17:08.527 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: echo "vidmone.cap.org" > /usr/local/horizon/conf/flags/opensearch.masternode 2023-07-22 12:17:08.528 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:17:41.119 INFO [http-nio-8080-exec-9] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: 
true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-22 12:18:08.612 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:18:08.619 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:18:08.619 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- output content : 2023-07-22 12:18:08.620 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Manifest cmd formed :: /usr/local/horizon/update/configureupdate.hzn manifest --set-version 3.3.6.0 2023-07-22 12:18:08.824 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:18:08.826 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:18:08.827 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: /usr/local/horizon/update/configureupdate.hzn manifest --set-version 3.3.6.0 2023-07-22 12:18:08.827 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:18:42.131 INFO [http-nio-8080-exec-3] c.v.v.l.s.n.s.NotificationServiceImpl - -- Authentication object is not null org.springframework.security.authentication.UsernamePasswordAuthenticationToken@fe908ae6: YXYXYXYX org.springframework.security.core.userdetails.User@c220133a: Username: admin@local; Password: YXYXYXYX Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: LCM_ADMIN; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@957e: RemoteIpAddress: 127.0.0.1; SessionId: null; Granted Authorities: LCM_ADMIN 2023-07-22 12:18:46.551 INFO [http-nio-8080-exec-9] c.v.v.l.a.c.AuthznCustomObjectMapper - -- ConfigParamsDTO : ConfigParamsDTO 
[configname=lcm.vidm.registered, configvalue=true] 2023-07-22 12:18:46.553 INFO [http-nio-8080-exec-9] c.v.v.l.a.c.AuthznCustomObjectMapper - -- ConfigParamsDTO : ConfigParamsDTO [configname=lcm.firstboot.admin.password.changed, configvalue=true] 2023-07-22 12:19:08.920 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Setting backrev version in manifest-installed.xml to: 3.3.6.0 2023-07-22 12:19:08.928 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:19:08.928 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:19:08.929 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- output content : Setting backrev version in manifest-installed.xml to: 3.3.6.0 2023-07-22 12:19:08.929 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Fetching upgrade binaries from /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/ 2023-07-22 12:19:08.930 INFO [pool-3-thread-27] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Creating upgrade script - /opt/vmware/horizon/vrslcm_vidm_upgrade.sh on vIDM host - vidmone.cap.org 2023-07-22 12:19:09.022 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:19:09.023 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:19:09.024 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: echo "/usr/local/horizon/update/updateoffline.hzn -f /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo-lcm.tar.gz echo \$? 
> /opt/vmware/horizon/vidmUpgradeCommandExitCode.log" > /opt/vmware/horizon/vrslcm_vidm_upgrade.sh && chmod 777 /opt/vmware/horizon/vrslcm_vidm_upgrade.sh 2023-07-22 12:19:09.024 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:19:09.611 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:19:09.612 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:19:09.612 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 0 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Command output stream :: 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:19:09.613 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Invoking upgrade script using the command :: nohup /opt/vmware/horizon/vrslcm_vidm_upgrade.sh > /opt/vmware/horizon/vrslcm_vidm_upgrade.log 2>&1 & 2023-07-22 12:19:09.614 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- vIDM ENDPOINT HOST :: vidmone.cap.org 2023-07-22 12:19:09.614 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- COMMAND :: nohup /opt/vmware/horizon/vrslcm_vidm_upgrade.sh > /opt/vmware/horizon/vrslcm_vidm_upgrade.log 2>&1 & 2023-07-22 12:19:09.697 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:19:09.698 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- 
------------------------------------------------------ 2023-07-22 12:19:09.698 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command: nohup /opt/vmware/horizon/vrslcm_vidm_upgrade.sh > /opt/vmware/horizon/vrslcm_vidm_upgrade.log 2>&1 & 2023-07-22 12:19:09.698 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:20:09.815 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:20:09.822 INFO [pool-3-thread-27] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:20:09.823 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Command Status code :: 0 2023-07-22 12:20:09.823 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:20:09.823 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Output Stream :: 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- Error Stream :: 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:20:09.824 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- null 2023-07-22 12:20:09.825 INFO [pool-3-thread-27] c.v.v.l.v.d.h.VidmUtil - -- ==================================================== 2023-07-22 12:20:09.825 INFO [pool-3-thread-27] c.v.v.l.v.c.t.VidmOfflineUpgradeTask - -- Upgrade initiated successfully. 2023-07-22 12:20:09.825 INFO [pool-3-thread-27] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmOfflineUpgradeCompletion vRSLCM waits for the upgrade to complete. There are 45 retries. 
So we expect the upgrade to complete within the retry window. 2023-07-22 12:20:10.366 INFO [pool-3-thread-25] c.v.v.l.v.c.t.VidmRebootTask - -- Starting :: vIDM reboot Task 2023-07-22 12:20:10.367 INFO [pool-3-thread-25] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property timeoutSleep value is obtained from Config Service : 60000 2023-07-22 12:20:10.368 INFO [pool-3-thread-25] c.v.v.l.v.c.t.u.VidmUpgradeTaskUtil - -- vIDM Configuration property timeoutCount value is obtained from Config Service : 45 2023-07-22 12:20:10.368 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Checking vIDM upgrade status on vIDM host - vidmone.cap.org 2023-07-22 12:20:10.456 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:20:10.460 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:20:10.461 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: ls /opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:20:10.462 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:20:11.004 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 2 2023-07-22 12:20:11.004 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:20:11.005 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 2, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:20:11.005 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 2 2023-07-22 12:20:11.005 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:20:11.006 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream :: 2023-07-22 12:20:11.006 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 
2023-07-22 12:20:11.006 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:20:11.006 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Waiting for upgrade to complete, sleeping for 60000 milliseconds 2023-07-22 12:21:11.006 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Number of retries left to check vIDM upgrade status on the node - vidmone.cap.org : 44 2023-07-22 12:21:11.164 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:21:11.170 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:21:11.171 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: ls /opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:21:11.172 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:21:11.715 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 2 2023-07-22 12:21:11.716 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:21:11.716 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 2, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:21:11.717 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 2 2023-07-22 12:21:11.718 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:21:11.718 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream :: 2023-07-22 12:21:11.718 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:21:11.718 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:21:11.718 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Waiting for upgrade to 
complete, sleeping for 60000 milliseconds 2023-07-22 12:22:11.718 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Number of retries left to check vIDM upgrade status on the node - vidmone.cap.org : 43 Eventually, the vIDM upgrade completes. 2023-07-22 12:25:13.798 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Number of retries left to check vIDM upgrade status on the node - vidmone.cap.org : 40 2023-07-22 12:25:13.889 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:25:13.889 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:25:13.889 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: ls /opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:25:13.890 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:25:14.439 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 2 2023-07-22 12:25:14.440 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:25:14.440 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 2, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:25:14.440 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 2 2023-07-22 12:25:14.441 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:25:14.441 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream :: 2023-07-22 12:25:14.441 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:25:14.441 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:25:14.441 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Waiting for upgrade to complete, 
sleeping for 60000 milliseconds 2023-07-22 12:26:14.441 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Number of retries left to check vIDM upgrade status on the node - vidmone.cap.org : 39 2023-07-22 12:26:14.526 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:26:14.527 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:14.527 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: ls /opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:26:14.527 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:15.075 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:26:15.076 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:26:15.076 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "/opt/vmware/horizon/vidmUpgradeCommandExitCode.log\n", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 0 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream ::/opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:26:15.077 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- File: /opt/vmware/horizon/vidmUpgradeCommandExitCode.log found 2023-07-22 12:26:15.294 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing 
command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:26:15.301 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:15.302 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: cat /opt/vmware/horizon/vidmUpgradeCommandExitCode.log 2023-07-22 12:26:15.302 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:15.848 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:26:15.848 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:26:15.849 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "0\n", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:26:15.849 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 0 2023-07-22 12:26:15.850 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:15.850 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream ::0 2023-07-22 12:26:15.850 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:15.850 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:26:15.850 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Upgrade command exit status obtained from /opt/vmware/horizon/vidmUpgradeCommandExitCode.log is 0. 
vIDM upgrade completed successfully 2023-07-22 12:26:15.851 INFO [pool-3-thread-25] c.v.v.l.v.c.u.VidmFileChangeUtil - -- Number of retries left to check vIDM upgrade status on the node - vidmone.cap.org : 38 2023-07-22 12:26:15.930 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Executing command on the host: vidmone.cap.org , as user: root:KXKXKXKX 2023-07-22 12:26:15.930 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:15.930 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command: find /usr/local/horizon/conf -iname update.success 2023-07-22 12:26:15.930 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- ------------------------------------------------------ 2023-07-22 12:26:16.479 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- exit-status: 0 2023-07-22 12:26:16.480 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2023-07-22 12:26:16.480 INFO [pool-3-thread-25] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "/usr/local/horizon/conf/update.success\n", "errorData" : null, "commandTimedOut" : false } 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command exit code :: 0 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command output stream ::/usr/local/horizon/conf/update.success 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- ============================================== 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.d.h.VidmUtil - -- Command error stream :: null 2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.c.t.VidmRebootTask - -- File: /usr/local/horizon/conf/update.success found. Upgrade completed successfully! 
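The polling pattern above can be sketched generically: vRSLCM checks for /opt/vmware/horizon/vidmUpgradeCommandExitCode.log with `ls` (exit-status 2 until the file exists), sleeping 60 seconds between attempts for up to 45 retries, then reads the file. This is a hedged reconstruction, not vRSLCM source; wait_for_exit_code is a hypothetical function whose defaults mirror the logged timeoutCount and timeoutSleep settings:

```shell
# Hypothetical helper mirroring the retry loop seen in the log:
# usage: wait_for_exit_code FILE [RETRIES] [SLEEP_SECONDS]
wait_for_exit_code() {
    file="$1"; retries="${2:-45}"; sleep_s="${3:-60}"
    while [ "$retries" -gt 0 ]; do
        if [ -f "$file" ]; then       # vRSLCM uses "ls <file>" for this check
            cat "$file"               # "0" means the upgrade command succeeded
            return 0
        fi
        retries=$((retries - 1))
        sleep "$sleep_s"
    done
    echo "TIMEOUT"
    return 1
}

# Illustrative invocation against the real path:
# wait_for_exit_code /opt/vmware/horizon/vidmUpgradeCommandExitCode.log
```

With 45 retries at 60 seconds each, the loop allows roughly 45 minutes for the upgrade before giving up, which comfortably covers the ~11 minutes observed here.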
2023-07-22 12:26:16.481 INFO [pool-3-thread-25] c.v.v.l.v.c.t.VidmRebootTask - -- vIDM upgrade command exit status: 0. vIDM upgrade completed successfully

The upgrade takes around 11 minutes on the primary node. Some post-upgrade actions are then performed before it proceeds with the further steps.

VMware Identity Manager upgrade logs reference: /opt/vmware/horizon/vrslcm_vidm_upgrade.log

/***** Identifies that the upgrade is through vRSLCM *****/
Upgrade via LCM
dualbootupdate-3.3.7.0-21173100.tar.gz
identity-manager-3.3.7.0-21173100-updaterepo.zip
Archive: /var/tmp/2cd58d3a-c41b-4109-8a00-dec658ce9a6b/identity-manager-3.3.7.0-21173100-updaterepo.zip
creating: manifest/
inflating: manifest/manifest-latest.xml
inflating: manifest/manifest-repo.xml
inflating: manifest/manifest-latest.xml.sha256
extracting: manifest/manifest-latest.xml.sig
creating: package-pool/
inflating: package-pool/tar-1.30-5.ph3.x86_64.rpm
inflating: package-pool/procps-ng-3.3.15-3.ph3.x86_64.rpm
inflating: package-pool/irqbalance-1.4.0-5.ph3.x86_64.rpm
inflating: package-pool/iptables-1.8.3-6.ph3.x86_64.rpm
inflating: package-pool/device-mapper-event-2.02.187-2.ph3.x86_64.rpm
inflating: package-pool/python3-prettytable-0.7.2-6.ph3.noarch.rpm
inflating: package-pool/lsscsi-0.30-1.ph3.x86_64.rpm
inflating: package-pool/python2-libs-2.7.17-7.ph3.x86_64.rpm
inflating: package-pool/python3-xml-3.7.5-23.ph3.x86_64.rpm
inflating: package-pool/python3-six-1.12.0-1.ph3.noarch.rpm
inflating: package-pool/svadmin-rpm-3.3.7.0-21173100.noarch.rpm
inflating: package-pool/python3-requests-2.24.0-1.ph3.noarch.rpm
inflating: package-pool/libmnl-1.0.4-4.ph3.x86_64.rpm
inflating: package-pool/cracklib-2.9.7-2.ph3.x86_64.rpm
inflating: package-pool/sudo-1.9.5-5.ph3.x86_64.rpm
inflating: package-pool/bzip2-1.0.8-2.ph3.x86_64.rpm
inflating: package-pool/bc-1.07.1-3.ph3.x86_64.rpm
inflating: package-pool/filesystem-1.1-4.ph3.x86_64.rpm
inflating: package-pool/sed-4.5-2.ph3.x86_64.rpm
* * * *
inflating: package-pool/horizon-database-rpm-3.3.7.0-21173100.noarch.rpm
inflating: package-pool/libcap-2.25-8.ph3.x86_64.rpm
inflating: package-pool/python3-lxml-4.9.1-1.ph3.x86_64.rpm
inflating: package-pool/python2-2.7.17-7.ph3.x86_64.rpm
inflating: package-pool/horizon-preconfig-standard-3.3.7.0-21173100.noarch.rpm
inflating: package-pool/libuv-1.34.2-2.ph3.x86_64.rpm
The product RID matches so continue
Continuing...
Server is up, running the update
root 18068 8.0 0.0 19112 14548 ? S 12:19 0:00 python -m SimpleHTTPServer 8008
root 18072 0.0 0.0 5484 1820 ? S 12:19 0:00 grep SimpleHTTPServer
Checking website
Website found, validating RID
Website validated
Installing custom provider-runtime
tput: No value for $TERM and no -T specified (repeated 10 times)
Updating update rpm to 3.3.7.0
3.3.7.0
tput: No value for $TERM and no -T specified (repeated 10 times)
Getting the Current version...
Checking Space
FREESPACE 4715420 KB
Checking for updates...
/dualbootupdate.tar.gz
Current version: 3.3.6.0
Updating all VMs to version: 3.3.7.0
Update started, current version : 3.3.6.0
Time : Sat Jul 22 12:20:01 UTC 2023
%_pkgverify_level digest
%_pkgverify_level none
Running preupdate
Saving manifest.xml
This is an LCM deployment.
Assuming that the /usr/local/horizon/update/reindexingIndices.hzn script is already run and proceeding...
Skipping managenodes.inc in case of Upgrade via LCM
LCM added master hostname before upgrade : vidmone.cap.org
You are required to change your password immediately (password expired)
su: Authentication token is no longer valid; new one required (Ignored)
ERROR: KDC configuration not yet initialized. Use the init subcommand.
You are required to change your password immediately (password expired)
su: Authentication token is no longer valid; new one required (Ignored)
horizon-elasticsearch : normal uninstall
horizon-elasticsearch : normal uninstall
iNode usage on /var at %
runSqlVerification
12:20:07.608 [main] INFO com.vmware.horizon.utils.KeystoreUtilities - Failed to load non-fips bc provider, keystore conversions will not be available.
12:20:08.283 [main] DEBUG com.vmware.horizon.utils.KeystoreUtilities - Loading BCFKS stream from BCFIPS
SUCCESS: Connection verified for driver version: 42.2.12
Updating the vm.
Saving manifest.xml
Installing version - 3.3.7.0 Build 21173100
..................................................Connection to sfcbd lost
Attempting to reconnect: 1
.....................................................Waiting on upgrade process to complete: Vami upgrade process completed.
Restoring default provider-runtime
%_pkgverify_level digest
%_pkgverify_level digest
Running postupdate
0
false
Updating IDM CA keystore with JRE certificates
Import command completed: 151 entries successfully imported, 0 entries failed or cancelled
Configuring opensearch.yml file : vidmone.cap.org
Resetting RabbitMQ
Upgrade provisioning adapters
%d [%thread] %-5level %logger - %msg%n%d [%thread] %-5level %logger - %msg%nError : Error creating bean with name 'scopedTarget.cacheService' defined in com.vmware.horizon.cache.config.CacheServiceConfig: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.vmware.horizon.cache.CacheService]: Factory method 'getCacheService' threw exception; nested exception is org.hibernate.exception.GenericJDBCException: could not prepare statement
renamed '/usr/local/horizon/conf/iptables.cfg.rpmsave' -> '/usr/local/horizon/conf/iptables.cfg'
<79>Jul 22 12:25:30 root: updateiptables.hzn Discovering IP Address
Unable to retrieve IP list, using current ip list
No current ip list defined, using own ip
<79>Jul 22 12:25:30 root: updateiptables.hzn Processing port roles
<79>Jul 22 12:25:31 root: updateiptables.hzn Setup iptables locally
<79>Jul 22 12:25:31 root: setupIPTables.hzn begin iptable setup
<79>Jul 22 12:25:31 root: setupIPTables.hzn allow any ip to access tcp ports: 22 443 5432 6443 7443 80 8080 8443 9000 9300 9694 9898 9999
<79>Jul 22 12:25:32 root: setupIPTables.hzn vApp access only to tcp ports: 40002 40003 5701 9300
<79>Jul 22 12:25:32 root: setupIPTables.hzn finish with static rules
<79>Jul 22 12:25:33 root: setupIPTables.hzn save firewall rules
<79>Jul 22 12:25:33 root: setupIPTables.hzn end iptable setup
<79>Jul 22 12:25:33 root: updateiptables.hzn finish processing iptables
Rebuilding manifest file
Update completed, current version : 3.3.7.0
Time : Sat Jul 22 12:25:38 UTC 2023
Update complete, please reboot the VM.
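The two success markers that show up in these logs (the update.success file and the exit-code log) can also be checked by hand when you want to confirm the upgrade state of a vIDM node yourself. A minimal sketch; the function name and parameterization are mine for illustration, only the two paths in the comments come from the logs above:

```shell
# Sketch: reproduce the two success checks Suite Lifecycle runs against a vIDM node.
# On a real node the arguments would be /usr/local/horizon/conf and
# /opt/vmware/horizon/vidmUpgradeCommandExitCode.log; they are parameters here so
# the check can be pointed anywhere (e.g. run over SSH).
vidm_upgrade_succeeded() {
  conf_dir="$1"
  exit_log="$2"
  # The marker file written on success must exist...
  [ -f "$conf_dir/update.success" ] || return 1
  # ...and the recorded upgrade command exit status must be 0.
  [ "$(tr -d '[:space:]' < "$exit_log")" = "0" ]
}

# Example (on the appliance itself):
# vidm_upgrade_succeeded /usr/local/horizon/conf /opt/vmware/horizon/vidmUpgradeCommandExitCode.log \
#   && echo "vIDM upgrade completed successfully"
```

This mirrors what the VidmFileChangeUtil and VidmRebootTask entries above are doing over SSH: read the exit-code file, then look for update.success.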
Important Logs during Upgrade

  • What happens when I download a Product Binary? Where is it stored?

    VMware Aria Suite Lifecycle allows us to download product binaries once a "My VMware" account is mapped into it. What really happens in the backend? Where is the product binary stored? In short, when a download is triggered and the mapping is done, a content id is generated. This content id is something like the one shown below:

2023-07-14 03:46:27.764 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Created content with vmid :: ced3c5b4-1cdc-4107-899c-5261e949c67b.

Once it is created for this specific binary download, the binary is stored under /data/vm-config/vmrepo/productBinariesRepo/. Interested in knowing this in depth? Then let's jump into it and understand the whole process by looking at the logs.

The moment we click ADD for any product binary, whether an upgrade or an install, a validation is done and a request to the request service is generated.

****** ADD is clicked here ******
2023-07-14 03:32:56.415 INFO [http-nio-8080-exec-4] c.v.v.l.l.u.SettingInputValidationUtil - -- Product validation status : true
2023-07-14 03:32:56.415 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- Validation result for Setting :myvmwaredownload result true
2023-07-14 03:32:56.416 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- Setting Value before save: ""

****** Request to request service is generated ******
2023-07-14 03:32:56.418 INFO [http-nio-8080-exec-4] c.v.v.l.l.u.RequestSubmissionUtil - -- ++++++++++++++++++ Creating request to Request_Service :::>>> { "vmid" : "9a42d32b-8866-47ab-ba5b-e3bb4e119393", "transactionId" : null, "tenant" : "default", "requestName" : "myvmwaredownload", "requestReason" : "VROPS 8.12.1 Type Install - My VMware Product Binary Download", "requestType" : "MYVMWARE_DOWNLOAD", "requestSource" : null, "requestSourceType" : "user", "inputMap" : { "productId" : "vrops", "productVersion" : "8.12.1", "productBinaryType" : "Install", "productBinaryPath" : null, "componentName" : null, "mappingType" : null, "productName" : "VMware Aria Operations", "requestId" : null, "removeBinary" : null }, "outputMap" : { }, "state" : "CREATED", "executionId" : null, "executionPath" : null, "executionStatus" : null, "errorCause" : null, "resultSet" : null, "isCancelEnabled" : null, "lastUpdatedOn" : 1689305576418, "createdBy" : null }

As we can see from the request above, I am trying to download the VMware Aria Operations 8.12.1 Install binary.

It identifies how many requests are to be processed. An engine request is created and set to INPROGRESS:
2023-07-14 03:32:56.432 INFO [http-nio-8080-exec-4] c.v.v.l.l.u.SettingsHelper - -- MyVMware download requestIds : 9a42d32b-8866-47ab-ba5b-e3bb4e119393
2023-07-14 03:32:56.830 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- Processing request with ID : 9a42d32b-8866-47ab-ba5b-e3bb4e119393 with request type MYVMWARE_DOWNLOAD with request state INPROGRESS.

The state machine is now kicked in:
2023-07-14 03:32:57.911 INFO [scheduling-1] c.v.v.l.a.c.FlowProcessor - -- Injected OnStart Edge for the Machine ID :: myvmwaredownload

Validates the My VMware credentials:
2023-07-14 03:32:58.007 INFO [pool-3-thread-20] c.v.v.l.p.c.m.t.StartGenericMyVMwareCredentialsTask - -- Starting :: Start Validating the MyVMware Credentials

Starts downloading binaries after fetching the token for the user mapped under the My VMware account:
2023-07-14 03:32:58.563 INFO [pool-3-thread-25] c.v.v.l.p.c.m.t.MyVmwareDownloadTask - -- Starting :: Download VMware Aria Product Binaries
2023-07-14 03:32:58.565 INFO [pool-3-thread-25] c.v.v.l.p.c.m.t.MyVmwareDownloadTask - -- Trying to get access token with user arun@arunnukula.com
2023-07-14 03:32:58.566 INFO [pool-3-thread-25] c.v.v.l.c.a.InternalOnlyApiAspect - -- Internal Only Check for: execution(ResponseEntity com.vmware.vrealize.lcm.locker.controller.CredentialController.getPassword(String))
2023-07-14 03:32:58.568 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware
access token - Started 2023-07-14 03:32:58.569 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Myvmware base URL: https://apigw.vmware.com 2023-07-14 03:32:58.569 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token. Current try number : '1' 2023-07-14 03:32:59.964 INFO [pool-3-thread-25] c.v.v.l.u.RestHelper - -- Status code : 200 2023-07-14 03:32:59.965 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token - Finished 2023-07-14 03:32:59.966 INFO [pool-3-thread-25] c.v.v.l.p.c.m.t.MyVmwareDownloadTask - -- Successfully logged into my vmware with account username Checksum validation is done 2023-07-14 03:32:59.971 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2023-07-14 03:32:59.971 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /checksumRepo 2023-07-14 03:32:59.972 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /vrops/8.12.1/checksum.json 2023-07-14 03:32:59.972 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentDownloadController - -- URL :: /checksumRepo/vrops/8.12.1/checksum.json 2023-07-14 03:32:59.972 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /checksumRepo/vrops/8.12.1/checksum.json The initial phase of the download process begins 2023-07-14 03:33:12.263 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- *** Download requested VMware Aria Product : [VROPS-8.12.1]: OPERATIONS-8121 2023-07-14 03:33:12.263 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token - Started 2023-07-14 03:33:12.264 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Myvmware base URL: https://apigw.vmware.com Download group is identified 2023-07-14 03:45:30.871 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Download group to be checked 
OPERATIONS-8121 against OPERATIONS-8121. 2023-07-14 03:45:30.871 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- *** Binary file NAME match happened for : 'vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' 2023-07-14 03:45:30.872 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- *** Binary file NAME match happened for : 'vRealize_Operations_Manager_With_CP-8.x-to-8.12.1.21952629.pak' 2023-07-14 03:45:30.872 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- *** Binary file NAME match happened for : 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' Starts writing the binary to disk 2023-07-14 03:45:34.357 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Successfully created the metadata file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.metadata ' with content '886d6d4ce3ff995f1faa24f00e9aec86e2574477'. 2023-07-14 03:45:34.357 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Starting download of 'vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' to '/data/myvmware/vrops/8.12.1/install' ... 2023-07-14 03:45:34.393 INFO [pool-3-thread-25] c.v.v.l.u.RestHelper - -- Status code : 200 2023-07-14 03:45:34.393 INFO [pool-3-thread-25] c.v.v.l.u.DownloadHelper - -- Started writing input stream to the file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova'. 2023-07-14 03:46:27.764 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Created content with vmid :: ced3c5b4-1cdc-4107-899c-5261e949c67b. 
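Every step of one download carries the same request or content vmid, so the easiest way to follow a single download end to end is to grep for that id. A small helper of my own, not product tooling; the default log path is an assumption based on where these engine entries usually land on a Suite Lifecycle 8.x appliance, so pass your own path if it differs:

```shell
# Sketch: pull every log line that mentions a given request/content vmid.
# The default log location below is an assumption; override it with the
# second argument if your appliance logs elsewhere.
trace_request() {
  request_id="$1"
  log_file="${2:-/var/log/vmware/vrlcm/vmware_vrlcm.log}"
  grep -F "$request_id" "$log_file"
}

# Example, using the request id from the logs above:
# trace_request 9a42d32b-8866-47ab-ba5b-e3bb4e119393
```

`grep -F` treats the vmid as a fixed string, which matters because UUIDs contain hyphens that could otherwise be misread as option syntax or regex metacharacters.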
2023-07-14 03:46:27.764 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- SOURCE LOCATION :: /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova 2023-07-14 03:46:27.764 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- content.getRepositoryIdentifier() :: productBinariesRepo 2023-07-14 03:46:27.766 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- DESTINATION LOCATION :: productBinariesRepo 2023-07-14 03:46:27.768 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- DESTINATION FOLDER LOCATION :: /data/vm-config/vmrepo/productBinariesRepo/ce/ced3c5b4-1cdc-4107-899c-5261e949c67b Download is now complete and uploaded to content repo 2023-07-14 03:51:02.014 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Content uploaded successfully in the content repo 2023-07-14 03:51:02.016 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Download product binary file 'vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' - Finished. 2023-07-14 03:51:02.028 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Verifying if binary file 'vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' of 'VROPS-8.12.1' is fully downloaded. 2023-07-14 03:51:02.151 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Validate Checksum disabled false 2023-07-14 03:51:02.152 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- >> PROCESSING 1 of 1. 
File Name : vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova 2023-07-14 03:51:02.152 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Starting :: checksum matching for : /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova 2023-07-14 03:51:02.155 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract directory for source file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' is '/data/tempextract/9aeb0d09-be87-446e-bd33-14679eab6738/'. 2023-07-14 03:51:02.156 INFO [pool-3-thread-25] c.v.v.l.u.ShellExecutor - -- Executing shell command: tar -xf /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova -C /data/tempextract/9aeb0d09-be87-446e-bd33-14679eab6738/ --wildcards *.mf 2023-07-14 03:51:02.156 INFO [pool-3-thread-25] c.v.v.l.u.ProcessUtil - -- Execute tar 2023-07-14 03:51:02.291 INFO [pool-3-thread-25] c.v.v.l.u.ShellExecutor - -- Result: []. 2023-07-14 03:51:02.292 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' to '/data/tempextract/9aeb0d09-be87-446e-bd33-14679eab6738/' success. 
2023-07-14 03:51:02.293 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Binary file NAME match happened for : /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova 2023-07-14 03:51:02.295 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Adding result record for :: vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova 2023-07-14 03:51:02.296 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- ProductId: vrops The temporary extract directory is deleted 2023-07-14 03:51:02.296 INFO [pool-3-thread-25] c.v.v.l.u.f.FileUtil - -- Directory is deleted : /data/tempextract/9aeb0d09-be87-446e-bd33-14679eab6738 2023-07-14 03:51:02.300 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract directory '/data/tempextract/9aeb0d09-be87-446e-bd33-14679eab6738/' for source file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova' is deleted. 2023-07-14 03:51:02.300 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Checksum MATCHED for file vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova of VROPS-8.12.1. Along with VMware Aria Operations 8.12.1, the cloud proxy is downloaded as well. 2023-07-14 03:51:02.385 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get product download url for 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' - Started. 2023-07-14 03:51:04.855 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get product download url for 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' - Finished. 
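The "Checksum MATCHED" step above can be replayed by hand when a mapping fails: the expected SHA-1 is the value Suite Lifecycle writes into the .metadata file next to the binary (e.g. vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.metadata), and the actual value is just a sha1sum of the downloaded file. A rough sketch; the function name is mine, not product code:

```shell
# Sketch: re-run the whole-file SHA-1 comparison manually.
# $1 = path to the downloaded binary, $2 = path to its .metadata file,
# whose entire content is the expected SHA-1 hex digest.
verify_binary_checksum() {
  binary="$1"
  metadata="$2"
  expected=$(tr -d '[:space:]' < "$metadata")
  actual=$(sha1sum "$binary" | awk '{print $1}')
  if [ "$expected" = "$actual" ]; then
    echo "Checksum MATCHED"
  else
    echo "Checksum MISMATCH: expected $expected got $actual"
    return 1
  fi
}
```

A mismatch here usually means a truncated or corrupted download, which is exactly the case the Windows Connector checksum fix in Patch 2 addresses on the product side.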
2023-07-14 03:51:04.856 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- HOST of final download URL :: https://download2.vmware.com 2023-07-14 03:51:04.856 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- SHA1 Checksum check for file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' with checksum value '193444c7b424f67ab1a46939e4f1a3d7aa6b85af ' - started 2023-07-14 03:51:04.858 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- SHA1 Checksum check for file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' with checksum value '193444c7b424f67ab1a46939e4f1a3d7aa6b85af ' - Finished 2023-07-14 03:51:04.858 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Starting download of product binary file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova'. Size is : '1.65 GB'. 2023-07-14 03:51:04.859 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Download product binary file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' - Started. 2023-07-14 03:51:04.860 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Download product binary file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova'.Current try number : '1' 2023-07-14 03:51:04.860 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token - Started 2023-07-14 03:51:04.861 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Myvmware base URL: https://apigw.vmware.com 2023-07-14 03:51:04.862 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token. 
Current try number : '1' 2023-07-14 03:51:06.201 INFO [pool-3-thread-25] c.v.v.l.u.RestHelper - -- Status code : 200 2023-07-14 03:51:06.273 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Get myvmware access token - Finished Starts writing the download to the specific location 2023-07-14 03:51:06.275 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Successfully created the metadata file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.metadata' with content '193444c7b424f67ab1a46939e4f1a3d7aa6b85af'. 2023-07-14 03:51:06.277 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Starting download of 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' to '/data/myvmware/vrops/8.12.1/install' ... 2023-07-14 03:51:06.363 INFO [pool-3-thread-25] c.v.v.l.u.RestHelper - -- Status code : 200 2023-07-14 03:51:06.363 INFO [pool-3-thread-25] c.v.v.l.u.DownloadHelper - -- Started writing input stream to the file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova'. 2023-07-14 03:51:40.944 INFO [pool-3-thread-25] c.v.v.l.u.DownloadHelper - -- Finished writing input stream to the file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova'. 
2023-07-14 03:51:41.000 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Writing file to content repo with filename vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova. 2023-07-14 03:51:41.003 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Uploading product binary with filename cloudproxy.ova to content repo. 2023-07-14 03:51:41.003 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Content delete requested for url /productBinariesRepo/vrops/8.12.1/install/cloudproxy.ova. 2023-07-14 03:51:41.520 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Content delete is failed with given URL :: /productBinariesRepo/vrops/8.12.1/install/cloudproxy.ova 2023-07-14 03:51:41.521 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Creating content operation. 2023-07-14 03:51:41.532 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- URL :: /productBinariesRepo/vrops/8.12.1/install/cloudproxy.ova 2023-07-14 03:51:41.536 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- URL :: /productBinariesRepo/vrops/8.12.1/install/cloudproxy.ova 2023-07-14 03:51:41.537 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH LENGTH :: 6 2023-07-14 03:51:41.537 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH LENGTH TEST PASSED 2023-07-14 03:51:41.539 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> productBinariesRepo :: NAME -> __ROOT__ :KXKXKXKX PARENT -> 367eb197-9850-4013-b66e-e4c4077b3263 2023-07-14 03:51:41.550 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH :: 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::productBinariesRepo 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::vrops 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] 
c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::8.12.1 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::install 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::cloudproxy.ova 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- ADDING NODE - PATH LENGTH :: 6 2023-07-14 03:51:41.551 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> productBinariesRepo :: PARENT -> a530119f-d933-4391-9d62-6fe8d40d57f3 :: NAME -> vrops 2023-07-14 03:51:41.554 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> productBinariesRepo :: PARENT -> 45ce3af0-0325-4439-af4d-3160366d8bfd :: NAME -> 8.12.1 2023-07-14 03:51:41.557 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> productBinariesRepo :: PARENT -> d3e25c69-45db-4e05-b87c-cbf4aacbe24e :: NAME -> install 2023-07-14 03:51:41.559 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> productBinariesRepo :: PARENT -> fbc115f0-dc23-4e8b-87a7-5d057f51cfec :: NAME -> cloudproxy.ova 2023-07-14 03:51:41.562 INFO [pool-3-thread-25] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- ADDING NODE :: 5 2023-07-14 03:51:41.570 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Created content with vmid :: 3a051861-9b68-43a2-aa65-71abc24aae59. 
2023-07-14 03:51:41.570 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- SOURCE LOCATION :: /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova 2023-07-14 03:51:41.571 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- content.getRepositoryIdentifier() :: productBinariesRepo 2023-07-14 03:51:41.572 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- DESTINATION LOCATION :: productBinariesRepo 2023-07-14 03:51:41.573 INFO [pool-3-thread-25] c.v.v.l.c.c.FileContentDatabase - -- DESTINATION FOLDER LOCATION :: /data/vm-config/vmrepo/productBinariesRepo/3a/3a051861-9b68-43a2-aa65-71abc24aae59 2023-07-14 03:51:43.267 INFO [http-nio-8080-exec-2] c.v.v.l.d.i.u.InventoryWriteUtil - -- QUERY MAP LENGTH :: 3 2023-07-14 03:51:43.271 INFO [http-nio-8080-exec-2] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: ProductInventory 2023-07-14 03:51:43.592 INFO [http-nio-8080-exec-3] c.v.v.l.d.i.u.InventoryWriteUtil - -- QUERY MAP LENGTH :: 3 2023-07-14 03:51:43.592 INFO [http-nio-8080-exec-3] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: ProductInventory 2023-07-14 03:51:46.066 INFO [pool-3-thread-25] c.v.v.l.c.c.ContentRepositoryController - -- Content uploaded successfully in the content repo 2023-07-14 03:51:46.066 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadRestClient - -- Download product binary file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' - Finished. 2023-07-14 03:51:46.067 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Verifying if binary file 'vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' of 'VROPS-8.12.1' is fully downloaded. 2023-07-14 03:51:46.069 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Validate Checksum disabled false 2023-07-14 03:51:46.069 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- >> PROCESSING 1 of 1. 
File Name : vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova 2023-07-14 03:51:46.069 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Starting :: checksum matching for : /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova 2023-07-14 03:51:46.069 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract directory for source file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' is '/data/tempextract/b2c7eb4f-e3cc-49bc-b0e8-225d4a9c08ec/'. 2023-07-14 03:51:46.231 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' to '/data/tempextract/b2c7eb4f-e3cc-49bc-b0e8-225d4a9c08ec/' success. 2023-07-14 03:51:46.231 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Binary file NAME match happened for : /data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova 2023-07-14 03:51:46.233 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Adding result record for :: vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova 2023-07-14 03:51:46.235 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- ProductId: vrops 2023-07-14 03:51:46.235 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- File match found 2023-07-14 03:51:46.235 INFO [pool-3-thread-25] c.v.v.l.u.f.FileUtil - -- Directory is deleted : /data/tempextract/b2c7eb4f-e3cc-49bc-b0e8-225d4a9c08ec 2023-07-14 03:51:46.240 INFO [pool-3-thread-25] c.v.v.l.d.s.h.SourceMappingUtil - -- Extract directory '/data/tempextract/b2c7eb4f-e3cc-49bc-b0e8-225d4a9c08ec/' for source file '/data/myvmware/vrops/8.12.1/install/vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova' is deleted. 
2023-07-14 03:51:46.240 INFO [pool-3-thread-25] c.v.v.l.d.m.h.MyVmwareDownloadUtil - -- Checksum MATCHED for file vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova of VROPS-8.12.1. 2023-07-14 03:51:46.270 INFO [pool-3-thread-25] c.v.v.l.p.c.m.t.MyVmwareDownloadTask - -- Source mapping results for myvmware download is ::: [ { "productId" : "vrops", "productVersion" : "8.12.1", "productBinaryType" : "install", "productBinaryPath" : "vRealize-Operations-Manager-Appliance-8.12.1.21952151_OVF10.ova", "componentName" : "default", "mappingType" : "MY VMware", "productName" : "VMware Aria Operations", "requestId" : null, "removeBinary" : null }, { "productId" : "vrops", "productVersion" : "8.12.1", "productBinaryType" : "install", "productBinaryPath" : "vRealize-Operations-Cloud-Proxy-8.12.1.21952631_OVF10.ova", "componentName" : "cloudproxy", "mappingType" : "MY VMware", "productName" : "VMware Aria Operations", "requestId" : null, "removeBinary" : null } ]
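As the DESTINATION FOLDER LOCATION entries show, the content repo shards each binary into a subdirectory named after the first two characters of its content vmid (ce/ced3c5b4-..., 3a/3a051861-...). Given a vmid from the logs, the on-disk path can be derived with a small helper of my own (not product tooling):

```shell
# Sketch: build the content-repo path for a downloaded binary from its vmid,
# mirroring the DESTINATION FOLDER LOCATION pattern in the logs:
#   <repo>/<first two characters of vmid>/<vmid>
content_repo_path() {
  vmid="$1"
  repo="${2:-/data/vm-config/vmrepo/productBinariesRepo}"
  printf '%s/%s/%s\n' "$repo" "$(printf '%s' "$vmid" | cut -c1-2)" "$vmid"
}

# Example, with the content vmid created for the vROps appliance OVA:
# content_repo_path ced3c5b4-1cdc-4107-899c-5261e949c67b
# -> /data/vm-config/vmrepo/productBinariesRepo/ce/ced3c5b4-1cdc-4107-899c-5261e949c67b
```

This is handy when checking disk usage under /data or confirming that a mapped binary actually landed in the repo.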

  • Why should I upgrade my infrastructure and management apps frequently?

    Upgrading infrastructure and management applications regularly offers several benefits:

1. Security: Newer versions often include security patches and fixes, protecting your systems against vulnerabilities and reducing the risk of cyberattacks or data breaches.
2. Performance: Upgrades often bring performance improvements, optimizing resource utilization, enhancing speed, and enabling your infrastructure to handle increased workloads efficiently.
3. Compatibility: As technology advances, software and hardware requirements change. Upgrading ensures compatibility with newer technologies, preventing compatibility issues and enabling seamless integration with other systems.
4. Features and Functionality: Upgrades frequently introduce new features, functionality, and enhancements that can improve productivity, streamline operations, and provide a competitive edge. Staying up to date allows you to take advantage of these advancements.
5. Support and Maintenance: Software vendors typically provide ongoing support and maintenance for the latest versions, ensuring you have access to assistance when needed. Older versions may have limited support or no longer receive updates.
6. Cost Savings: Maintaining outdated infrastructure and applications can become costly due to increased downtime, security risks, and inefficient processes. Regular upgrades can help minimize these risks and potential expenses.

By keeping your infrastructure and management applications up to date, you can leverage the latest technology, enhance security, improve performance, and optimize your overall operations, ultimately benefiting your organization in the long run. VMware Aria Suite Lifecycle provides simplified lifecycle management through streamlined deployment, configuration, patching, upgrades, configuration management, and decommissioning processes. Suite Lifecycle (formerly vRSLCM) automates most of the repetitive tasks, which avoids human errors.
So, if you haven't started exploring VMware Aria Suite Lifecycle yet, now is the right time to begin! 👍
