
VMware Aria Cloud Extensibility Proxy | Deploy and Upgrade | Runbook


 

The cloud extensibility proxy is a virtual appliance (VA) used in the configuration of the on-premises extensibility action integrations and VMware Aria Automation Orchestrator 8.x integrations in Automation Assembler.


 

Download the PDF version of this blog, with all of the screenshots and snippets below:


Deploy & Upgrade Cloud Extensibility Proxy Deep Dive Runbook.pdf (11.51 MB)


 

Fetching the API Token and Creating a Locker Object


  • As a prerequisite, we need to fetch the refresh token and store it in the VMware Aria Suite Lifecycle (VASL) locker

  • Log in to your organization



  • Click on My Account to create a new refresh token


  • Click on Generate a New Token



  • Enter the token name and the roles you would like to assign. In this case, I've assigned everything since this is a demo



  • That's your token



  • Copy the token and store it in a safe location


n*************************R


  • Now, back in VASL, create a locker entry for the refresh token



  • Enter the refresh token and create a locker object



  • Remember, this locker object for the refresh token will be used during deployment of the cloud extensibility proxy



 

Deployment of Cloud Extensibility Proxy

  • Log in to VASL and click on VMware Aria Cloud



  • Click on "Create a Cloud Proxy"



  • Enter details such as the environment name and the password to be used during deployment



  • Select Cloud Extensibility Proxy tile



  • Accept EULA



  • Enter infrastructure details



  • Enter Network Details



  • I created a DNS entry for the cloud extensibility proxy; we will be using that FQDN and IP during deployment



  • In this pane, the proxy name is the identifier by which your cloud extensibility proxy will be referred to. Select the product password, which is the one used to log in to the appliance, and the refresh token locker object we created earlier.



  • Click Next to run a precheck



  • All prechecks are successful




  • Verify



  • Submit deployment request



  • The various phases are as below:


1. Validate Environment Data 
2. Infra Pre-Validation
3. Cloud Proxy Validations
4. Deploy Cloud Extensibility Proxy
	Start
	Start Cloud Proxy Generic
	Get Organization ID for Cloud Proxy Deployment
	Get Binary for Cloud Proxy Deployment
	Download Binary for Cloud Proxy Deployment
	Get OTK for Cloud Proxy Deployment
	Validate Deployment Inputs
	Deploy OVF
	Power On Virtual Machine
	Check Guest Tools Status
	Check Hostname/IP Status
	Final


5. Update Environment Details
6. Schedule Notifications
	

  • Deployment is now complete




  • Some important log snippets from VASL


***** Binary Location Task *****


2023-06-23 03:39:14.655 INFO  [pool-3-thread-27] c.v.v.l.c.c.t.CloudProxyGetBinaryLocationTask -  -- Start cloud proxy get binary location task
2023-06-23 03:39:14.662 INFO  [pool-3-thread-27] c.v.v.l.c.c.t.CloudProxyGetBinaryLocationTask -  -- Getting binary location for abxcloudproxy
2023-06-23 03:39:14.662 INFO  [pool-3-thread-27] c.v.v.l.c.d.r.u.CloudProxyServerRestUtil -  -- Request URL : https://api.mgmt.cloud.vmware.com/api/artifact-provider?artifact=cexp-data-collector
2023-06-23 03:39:14.667 INFO  [pool-3-thread-27] c.v.v.l.c.d.r.c.CloudProxyRestClient -  -- https://api.mgmt.cloud.vmware.com/api/artifact-provider?artifact=cexp-data-collector
2023-06-23 03:39:14.668 INFO  [pool-3-thread-27] c.v.v.l.c.d.r.c.CloudProxyRestClient -  -- Connecting without Proxy
2023-06-23 03:39:15.674 INFO  [pool-3-thread-27] c.v.v.l.c.d.r.c.CloudProxyRestClient -  -- API Response Status : 200 Response Message : {"artifact":"cexp-data-collector","providerUrl":"https://vro-appliance-distrib.s3.amazonaws.com/VMware-Extensibility-Appliance-SAAS.ova","latestOvaVersion":"7.6"}
2023-06-23 03:39:15.679 INFO  [pool-3-thread-27] c.v.v.l.c.c.t.CloudProxyGetBinaryLocationTask -  -- Binary location retrieved: https://vro-appliance-distrib.s3.amazonaws.com/VMware-Extensibility-Appliance-SAAS.ova
2023-06-23 03:39:15.680 INFO  [pool-3-thread-27] c.v.v.l.p.a.s.Task -  -- Injecting Edge :: OnCloudProxyGetBinaryLocationSuccess




***** Binary Download Task *****


2023-06-23 03:42:35.408 INFO  [pool-3-thread-10] c.v.v.l.u.DownloadHelper -  -- Finished writing input stream to the file '/data/cloudproxy/abxcloudproxy/VMware-Extensibility-Appliance-SAAS.ova'.
2023-06-23 03:42:35.411 INFO  [pool-3-thread-10] c.v.v.l.c.d.r.u.CloudProxyServerRestUtil -  -- file download successful
2023-06-23 03:42:40.411 INFO  [pool-3-thread-10] c.v.v.l.c.c.t.CloudProxyDownloadBinaryTask -  -- Cloud proxy binary downloaded to location /data/cloudproxy/abxcloudproxy/VMware-Extensibility-Appliance-SAAS.ova
2023-06-23 03:42:40.413 INFO  [pool-3-thread-10] c.v.v.l.p.a.s.Task -  -- Injecting Edge :: OnCloudProxyDownloadBinarySuccess




***** Fetching OTK or One Time Key *****


2023-06-23 03:42:41.097 INFO  [pool-3-thread-38] c.v.v.l.c.c.t.CloudProxyGetOneTimeKeyTask -  -- Start cloud proxy get OTK YXYXYXYX


2023-06-23 03:42:41.104 INFO  [pool-3-thread-38] c.v.v.l.c.d.r.u.FlexCommonUtils -  -- cspUrl : https://console.cloud.vmware.com/csp/gateway/am/api/auth/api-tokens/authorize
2023-06-23 03:42:41.573 INFO  [pool-3-thread-38] c.v.v.l.u.RestHelper -  -- Status code : 200
2023-06-23 03:42:41.575 INFO  [pool-3-thread-38] c.v.v.l.c.d.r.u.FlexCommonUtils -  -- {"statusCode":200,"responseMessage":null,"outputData":"{\"id_token\":\"eyJhbGciOiJSUzI1**********N2TqgHQDhAugCe2cQ\",\"token_type\":\"bearer\",\"expires_in\":1799,\"scope\":\"ALL_PERMISSIONS
 customer_number openid group_ids group_names\",\"access_token\":\"eyJhbGciOiJSUzI1NiI****YBTSn-FuoKRbUg\",\"refresh_token\":\"n-aeMxqH9M5****mSRYCoWBp4vgllvfsR\"}","token":null,"contentLength":-1,"allHeaders":null}
2023-06-23 03:42:42.834 INFO  [scheduling-1] c.v.v.l.a.c.EventProcessor -  -- Responding for Edge :: OnCloudProxyGetOtkSuccess
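The OTK task above exchanges the stored refresh token for a short-lived access token at the CSP authorize endpoint shown in the log. A minimal sketch of the same exchange with curl; the live call is commented out (REFRESH_TOKEN is a placeholder), and the parsing is demonstrated against a sample response of the shape seen in the log:

```shell
# CSP authorize endpoint, taken from the FlexCommonUtils log line above.
CSP_URL="https://console.cloud.vmware.com/csp/gateway/am/api/auth/api-tokens/authorize"

# Live call (requires a valid refresh token):
# response=$(curl -s -X POST "$CSP_URL" --data-urlencode "refresh_token=$REFRESH_TOKEN")

# Sample response with the same fields as the log output (values truncated):
response='{"id_token":"xxx","token_type":"bearer","expires_in":1799,"access_token":"eyJhbGciOiJSUzI1NiI","refresh_token":"n-aeMxqH9M5"}'

# Extract the access_token field from the JSON payload.
access_token=$(printf '%s' "$response" | sed -n 's/.*"access_token":"\([^"]*\)".*/\1/p')
echo "$access_token"
```

The same exchange is what VASL performs internally with the locker object created earlier.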




***** Deploy OVF Task *****


2023-06-23 03:42:44.164 INFO  [pool-3-thread-7] c.v.v.l.p.c.v.t.DeployOvfTask -  -- Starting :: DeployOvfTask
2023-06-23 03:42:44.414 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- Getting OvfManager from VC ServiceContent
2023-06-23 03:42:44.419 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- Cluster provided is: Singapore#CAP
2023-06-23 03:42:44.439 INFO  [pool-3-thread-7] c.v.v.l.d.v.v.u.CoreUtility -  -- Found the cluster in the Datacenter
2023-06-23 03:42:44.448 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- Setting host xx.xx.xx.xx for import spec creation
2023-06-23 03:42:44.449 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- Creating Import result


2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.p.c.v.t.DeployOvfTask -  -- {"name":null,"description":null,"vcServerUrl":"https://vc.cap.org/sdk","vcServerUsername":"vrsvc@cap.org","vcServerPassword":"JXJXJXJX"}
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- Importing OVf
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- ##########################################################
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- OvfFileItem
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- chunkSize: null
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- create: false
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- deviceId: /cexproxy/VirtualLsiLogicController0:0
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- path: Prelude_Extensibility_VA-8.12.1.31024-21715470-system.vmdk
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- size: 1783000064
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- ##########################################################
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- ##########################################################
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- OvfFileItem
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- chunkSize: null
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- create: false
2023-06-23 03:42:44.767 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- deviceId: /cexproxy/VirtualLsiLogicController0:1
2023-06-23 03:42:44.768 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- path: Prelude_Extensibility_VA-8.12.1.31024-21715470-data.vmdk
2023-06-23 03:42:44.768 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- size: 17189376
2023-06-23 03:42:44.768 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- ##########################################################
2023-06-23 03:42:44.768 INFO  [pool-3-thread-7] c.v.v.l.d.v.d.i.BaseOvfDeploy -  -- ##########################################################
2023-06-23 03:42:45.535 INFO  [Thread-1642328] c.v.v.l.d.v.d.i.OvfDeployLocal -  -- completed percent uploaded------------------------------------------------------------------------------------------>0
2023-06-23 03:45:45.595 INFO  [Thread-1642328] c.v.v.l.d.v.d.i.OvfDeployLocal -  -- completed percent uploaded------------------------------------------------------------------------------------------>86
2023-06-23 03:46:10.999 INFO  [pool-3-thread-7] c.v.v.l.p.c.v.t.DeployOvfTask -  -- OVF deployment completed successfully. Will be proceeding with post deployment process
2023-06-23 03:46:11.000 INFO  [Thread-1642328] c.v.v.l.d.v.d.i.OvfDeployLocal -  -- ********************** Thread interrupted *******************
2023-06-23 03:46:11.007 INFO  [pool-3-thread-7] c.v.v.l.p.c.v.t.DeployOvfTask -  -- Found VM : cexproxy .proceeding further
2023-06-23 03:46:11.008 INFO  [pool-3-thread-7] c.v.v.l.p.c.v.t.DeployOvfTask -  -- upgrade_vm ?null
2023-06-23 03:46:11.008 INFO  [pool-3-thread-7] c.v.v.l.p.a.s.Task -  -- Injecting Edge :: OnSuccessfulOvfDeployment




***** CEXP deployment completed *****


2023-06-23 03:47:08.356 INFO  [scheduling-1] c.v.v.l.r.c.RequestProcessor -  -- Updating the Environment request status to COMPLETED for environment : CEXP with request ID : 232ad7e7-7e5a-4946-9742-ac38bc51db95 and request type : VALIDATE_AND_CREATE_ENVIRONMENT.

  • Going back to VASL, I can see my CEXP deployed



  • If we go to CAS and then click on VMware Aria Automation, we can see our CEXP available for consumption





  • List of pods running inside the cloud extensibility proxy:


root@cexproxy [ /services-logs/prelude ]# kubectl get pods -A
NAMESPACE     NAME                                            READY   STATUS      RESTARTS   AGE
ingress       ingress-ctl-traefik-6b9fc769fc-4kxs4            1/1     Running     0          15m
kube-system   command-executor-p8rgl                          1/1     Running     0          18m
kube-system   coredns-jpb4w                                   1/1     Running     0          18m
kube-system   health-reporting-app-6q8r6                      1/1     Running     0          18m
kube-system   kube-apiserver-cexproxy.cap.org                 1/1     Running     0          18m
kube-system   kube-controller-manager-cexproxy.cap.org        1/1     Running     0          18m
kube-system   kube-flannel-ds-tplcv                           1/1     Running     0          18m
kube-system   kube-node-monitor-mwc56                         1/1     Running     0          18m
kube-system   kube-proxy-4lkwc                                1/1     Running     0          18m
kube-system   kube-scheduler-cexproxy.cap.org                 1/1     Running     0          18m
kube-system   kubelet-rubber-stamp-r47x8                      1/1     Running     0          18m
kube-system   metrics-server-5psc6                            1/1     Running     0          18m
kube-system   network-health-monitor-25cgn                    1/1     Running     0          18m
kube-system   predictable-pod-scheduler-thxd8                 1/1     Running     0          18m
kube-system   prelude-network-monitor-cron-1687492980-szm8w   0/1     Completed   0          3m48s
kube-system   prelude-network-monitor-cron-1687493160-lgtsh   0/1     Completed   0          47s
kube-system   state-enforcement-cron-1687492920-k6v4r         0/1     Completed   0          4m48s
kube-system   state-enforcement-cron-1687493040-vjvn4         0/1     Completed   0          2m48s
kube-system   state-enforcement-cron-1687493160-zvv9z         0/1     Completed   0          47s
kube-system   update-etc-hosts-gzlwr                          1/1     Running     0          18m
prelude       orchestration-ui-app-67bd95c5c9-4f2mm           1/1     Running     0          4m5s
prelude       postgres-0                                      1/1     Running     0          15m
prelude       proxy-service-5d7564fbf8-5ktzk                  1/1     Running     0          15m
prelude       vco-app-57dd775776-lvscf                        3/3     Running     0          8m8s
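A quick way to confirm the appliance is healthy is to sweep this output for any pod whose STATUS is neither Running nor Completed. A small sketch of that filter; the sample lines (including the hypothetical `broken-pod-abc123`) stand in for the live command, which would be `kubectl get pods -A --no-headers`:

```shell
# Sample `kubectl get pods -A --no-headers` lines; broken-pod-abc123 is a
# made-up example of an unhealthy pod for illustration.
sample='ingress ingress-ctl-traefik-6b9fc769fc-4kxs4 1/1 Running 0 15m
prelude vco-app-57dd775776-lvscf 3/3 Running 0 8m8s
prelude broken-pod-abc123 0/1 CrashLoopBackOff 4 10m'

# Field 4 is STATUS; print the name (field 2) of anything not healthy.
unhealthy=$(printf '%s\n' "$sample" | awk '$4 != "Running" && $4 != "Completed" {print $2}')
echo "$unhealthy"
```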


 

Upgrading Cloud Extensibility Proxy



Before performing the upgrade, verify the current version:
root@cexproxy [ /services-logs/prelude ]# vracli version
Version - 8.12.1.31024 Build 21715470
Description - Aria Automation Extensibility Appliance 05/2023

  • According to the patch portal, this is the latest one



  • Download the ISO and upload it to one of the datastores in the vCenter where this cloud extensibility proxy resides

  • I took a snapshot before making any changes



  • Map the downloaded ISO to the appliance






  • Now that the ISO is mapped, let's find the ID of the CD-ROM using the blkid command


root@cexproxy [ ~ ]# blkid
/dev/sr0: UUID="2023-06-19-02-30-47-00" LABEL="CDROM" TYPE="iso9660"
/dev/sda1: PARTUUID="b0fed661-3a90-4b48-a0db-e3f74915c76f"
/dev/sda2: UUID="6050b84c-a9bb-4387-baef-d43f2b7b9c75" TYPE="ext3" PARTUUID="e907703a-dba2-44ea-b22c-c63a2ccf6a47"
/dev/sda3: UUID="9fbb9c60-bbaf-4491-bd91-7caa2e44c852" TYPE="swap" PARTUUID="a3b288b9-1a4f-433c-9a1d-fd85afabddff"
/dev/sda4: UUID="6b8a710d-390f-483f-9d62-db66a2ce6429" TYPE="ext4" PARTUUID="16886689-6979-4fab-b2b8-ddb1488e11d1"
/dev/sdc: UUID="uIoR7Q-0Hr5-HjWx-6iOq-cZN7-3VLw-ahJAPH" TYPE="LVM2_member"
/dev/sdb: UUID="rPeD9M-lw0t-dIMu-socb-FxDj-Q3J3-EinsF4" TYPE="LVM2_member"
/dev/mapper/data_vg-data: UUID="4aa2662d-c4f0-41fd-b004-de88e244969e" TYPE="ext4"
/dev/sdd: UUID="vqX6KJ-wCi4-Fc4T-A0cP-4lCP-0tmY-0eoa0j" TYPE="LVM2_member"
/dev/mapper/logs_vg-log: UUID="9c1b3c0a-68fb-4348-8ee1-9948449c7c6c" TYPE="ext4"
/dev/mapper/home_vg-home: UUID="7617d195-007f-49c7-99fc-d01751e54693" TYPE="ext4"
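Rather than eyeballing the list, the CD-ROM device can be picked out of the blkid output by its iso9660 filesystem type. A minimal sketch, with sample lines mirroring the output above standing in for a live `blkid` call:

```shell
# Sample blkid output (abbreviated); on the appliance, use: blkid_out=$(blkid)
blkid_out='/dev/sr0: UUID="2023-06-19-02-30-47-00" LABEL="CDROM" TYPE="iso9660"
/dev/sda2: UUID="6050b84c-a9bb-4387-baef-d43f2b7b9c75" TYPE="ext3"'

# The device name precedes the first colon; match the iso9660 line.
cdrom_dev=$(printf '%s\n' "$blkid_out" | awk -F: '/TYPE="iso9660"/{print $1}')
echo "$cdrom_dev"
```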

  • Mount the CD-ROM drive based on the ID shown above. In our case it is /dev/sr0:

mount /dev/sr0 /mnt/cdrom




root@cexproxy [ ~ ]# mount /dev/sr0 /mnt/cdrom
mount: /mnt/cdrom: WARNING: device write-protected, mounted read-only.
root@cexproxy [ ~ ]#


  • Back up your cloud extensibility proxy by taking a virtual machine (VM) snapshot. We've already done that.

  • To initiate the upgrade, run the vracli upgrade exec -y --repo cdrom:// command


vracli upgrade exec -y --repo cdrom://




root@cexproxy [ ~ ]# vracli upgrade exec -y --repo cdrom://
Loading update bootstrap.
Update bootstrap loaded successfully.
Saving configuration parameters
Running procedures in background...
.......................................
Upgrade procedure started in background.
During upgrade, downtime of the services and restarts of the VAs are expected.
Use 'vracli upgrade status --follow' to monitor the progress.
... Preparing for upgrade .....................................
...............................................................
Running health check before upgrade for nodes and pods.
Health check before upgrade for nodes and pods passed successfully.
...............................................................
Configuring SSH channels.
SSH channels configured successfully.
...............................................................
Checking version of current platform and services.
Version check of current platform and services passed successfully.
...............................................................
Running infrastructure health check before upgrade.
Infrastructure health check before upgrade passed successfully.
...............................................................
Configuring upgrade repository.
Upgrade repository configuration completed successfully.
...............................................................
Saving restore point for artifacts. This might take more than an hour.
Restore point for artifacts saved successfully.
...............................................................
Shutting down services.
Services shut down successfully.
...............................................................
Saving system configurations.
System configuration saved successfully.
...............................................................
Saving restore point for data. This might take more than an hour.
Restore point for data saved successfully.
... Upgrade preparation completed successfully. ...
... Executing upgrade .........................................
...............................................................
Running additional health check before upgrade for nodes.
Health check before upgrade for nodes passed successfully.
...............................................................
Starting upgrade monitor.
Upgrade monitor started successfully.
...............................................................
Deactivating cluster of appliance nodes. This might take several minutes.
Cluster deactivation of appliance nodes skipped.
...............................................................
Configuring upgrade.
Upgrade is in progress. This might take more than an hour to complete and the system might be rebooted several times.
Note: When the VAMI upgrade exits, the session will be closed. Once the appliance reboots, services deployment will start.

  • After a while, once the services are deployed, the upgrade is complete



  • It all starts with the CD-ROM mapping and executing the upgrade command

  • The first thing that occurs is the mapping of the repository:


[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Downloading manifest ...
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Manifest downloaded successfully.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Validating manifest ...
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Manifest signature validated successfully.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Repository product ID matched successfully.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Manifest validated successfully.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Searching for update bootstrap RPM ...
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Update bootstrap RPM found.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Update bootstrap RPM downloaded successfully.
[INFO][2023-06-23 05:39:47][cexproxy.cap.org] Applying the bootstrap system ...

  • Then the next sequence of events is triggered:

    • Bootstrap is applied

    • Version is retrieved

    • Installables list built successfully

    • Upgrade plan is created

    • Node status is verified

    • Infrastructure health check is performed

    • Update repository set successfully


[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Bootstrap system applied successfully.
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Retrieving product version...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Product version retrieved successfully.
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/patch-template ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/rel830 ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/rel840 ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/rel842 ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/rel881 ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Searching for bootstrap package for product version 8.12.1.31024 in /var/vmware/prelude/upgrade/bootstrap/rel882 ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Selecting installables ...
[INFO][2023-06-23 05:39:50][cexproxy.cap.org] Building installables list ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Installables list built successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Installables selected successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Creating upgrade plan ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Creating upgrade configaration from installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Aggregating upgrade configuration of all installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Upgrade configuration of all installables aggregated successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Computing version dependent upgrade configuration of installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Version dependent upgrade configuration of installables computed sucessfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Reducing upgrade configuration of installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Upgrade configuration of installables reduced successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Adjusting upgrade configuration of installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Upgrade configuration of installables adjusted successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Determining the effective upgrade configuration of installables ...
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Effective upgrade configuration of installables determined successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Upgrade configuration from installables created successfully.
[INFO][2023-06-23 05:39:51][cexproxy.cap.org] Upgrade plan created successfully.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Retrieving nodes status...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Nodes status retrieval succeeded.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Processing nodes data...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Nodes data processing succeeded.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Verifying nodes...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Nodes verification completed successfully.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Retrieving services status...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Services status retrieval succeeded.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Processing services data...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Services data processing succeeded.
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Verifying services...
[INFO][2023-06-23 05:39:53][cexproxy.cap.org] Services verification completed successfully.
[INFO][2023-06-23 05:39:54][cexproxy.cap.org] Creating SSH key pairs
[INFO][2023-06-23 05:39:55][cexproxy.cap.org] SSH key pairs created successfully.
[INFO][2023-06-23 05:39:55][cexproxy.cap.org] Configuring SSH on nodes.
[INFO][2023-06-23 05:39:56][cexproxy.cap.org] Service sshd status: active/enabled
[INFO][2023-06-23 05:39:56][cexproxy.cap.org] SSH port 22 is open.
[INFO][2023-06-23 05:39:56][cexproxy.cap.org] SSH configurated on nodes successfully.
[INFO][2023-06-23 05:39:56][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/ssh-noop.sh at host: cexproxy.cap.org
[INFO][2023-06-23 05:40:05][cexproxy.cap.org] Remote command succeeded: /opt/scripts/upgrade/ssh-noop.sh at host: cexproxy.cap.org
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added 'cexproxy.cap.org,10.109.45.58' (ED25519) to the list of known hosts.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:40:05][cexproxy.cap.org] Verifying remote nodes are able to connect to one another and to this node.
[INFO][2023-06-23 05:40:06][cexproxy.cap.org] Verification that nodes are able to connect to one another and to this node succeeded.
[INFO][2023-06-23 05:40:07][cexproxy.cap.org] Retriving product versions on all nodes.
[INFO][2023-06-23 05:40:07][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/vami-save-vers.sh prep at host: cexproxy.cap.org
[INFO][2023-06-23 05:40:17][cexproxy.cap.org] Retrieving product version.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:40:18][cexproxy.cap.org] Product version retrieved successfully.
[INFO][2023-06-23 05:40:18][cexproxy.cap.org] Remote command succeeded: /opt/scripts/upgrade/vami-save-vers.sh prep at host: cexproxy.cap.org
[INFO][2023-06-23 05:40:18][cexproxy.cap.org] Product versions successfully retrieved from all nodes.
[INFO][2023-06-23 05:40:18][cexproxy.cap.org] Checking that product versions match across all nodes.
[INFO][2023-06-23 05:40:18][cexproxy.cap.org] Product versions match across all nodes verified.
[INFO][2023-06-23 05:40:19][cexproxy.cap.org] Checking appliance nodes infrastructure health
[INFO][2023-06-23 05:40:19][cexproxy.cap.org] Running remote command: /opt/health/run-once.sh health at host: cexproxy.cap.org
[INFO][2023-06-23 05:40:42][cexproxy.cap.org] Remote command succeeded: /opt/health/run-once.sh health at host: cexproxy.cap.org
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:40:42][cexproxy.cap.org] Infrastructure health check passed on all appliance nodes.
[INFO][2023-06-23 05:40:43][cexproxy.cap.org] Setting up update repository on all nodes.
[INFO][2023-06-23 05:40:43][cexproxy.cap.org] Running remote command: vamicli update --repo cdrom:// at host: cexproxy.cap.org
[INFO][2023-06-23 05:40:53][cexproxy.cap.org] Remote command succeeded: vamicli update --repo cdrom:// at host: cexproxy.cap.org
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:40:53][cexproxy.cap.org] Update repository set successfully on all nodes.
[INFO][2023-06-23 05:40:53][cexproxy.cap.org] Verifying the access to update repository on all nodes.
[INFO][2023-06-23 05:40:53][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/vami-config-repo.sh 'local' at host: cexproxy.cap.org
[INFO][2023-06-23 05:41:02][cexproxy.cap.org] Checking FIPS configuration.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023

  • Once the update repository is checked, the next action shifts to VAMI again:


23/06/2023 05:40:53 [INFO] Setting local repository address. url=cdrom://, username=, password=<empty>
23/06/2023 05:41:03 [INFO] Checking available updates. jobid=1
23/06/2023 05:41:03 [INFO] Using update repository found on CDROM device: /dev/sr0
23/06/2023 05:41:03 [INFO] Downloading latest manifest. jobId=1, url=file:///tmp/update_agent_cdrom_ugPpOU/update, username=, password=
23/06/2023 05:41:04 [INFO] Signature script output: Verified OK
23/06/2023 05:41:04 [INFO] Signature script output:
23/06/2023 05:41:04 [INFO] Manifest signature verification passed
23/06/2023 05:41:04 [INFO] New manifest. installed-version=8.12.1.31024, downloaded-version=8.12.2.31329
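The manifest line above is where the upgrade decision is made: the installed version is compared against the downloaded one. A small sketch of that comparison using `sort -V` (GNU version sort, assumed available on the appliance), with the two version strings taken from the log:

```shell
# Versions from the "New manifest" log line above.
installed="8.12.1.31024"
downloaded="8.12.2.31329"

# sort -V orders dotted version strings numerically; the newest sorts last.
newest=$(printf '%s\n' "$installed" "$downloaded" | sort -V | tail -n1)

if [ "$newest" != "$installed" ]; then
  echo "upgrade available: $downloaded"
fi
```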

  • Restore points are now saved, services are stopped, and preparation for the VAMI upgrade is initiated:


[INFO][2023-06-23 05:41:03][cexproxy.cap.org] Verifying the access to the update repository.
[INFO][2023-06-23 05:41:04][cexproxy.cap.org] Access to the update repository verified.
[INFO][2023-06-23 05:41:04][cexproxy.cap.org] Verifying availability of new product version in the repository.
[INFO][2023-06-23 05:41:04][cexproxy.cap.org] Availability of new version in the repository verified.
[INFO][2023-06-23 05:41:04][cexproxy.cap.org] Remote command succeeded: /opt/scripts/upgrade/vami-config-repo.sh 'local' at host: cexproxy.cap.org
[INFO][2023-06-23 05:41:04][cexproxy.cap.org] Update repository is verified successfully on all nodes.
[INFO][2023-06-23 05:41:05][cexproxy.cap.org] Saving restore points on all nodes.
[INFO][2023-06-23 05:41:05][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-save.sh 'local' immutable-artifact artifacts-lastworking at host: cexproxy.cap.org
[INFO][2023-06-23 05:41:20][cexproxy.cap.org] Saving local restore points.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Saving restore point for /opt/charts .
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Restore point for /opt/charts saved successfully.
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Verifying source and destination checksums for restore point /opt/charts
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Source and destination checksums for restore point /opt/charts matched successfully.
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Selecting docker images to save in local restore point.
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Docker images successfully selected to save in local restore point.
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Saving docker images in local restore point.
[INFO][2023-06-23 05:41:21][cexproxy.cap.org] Saving docker image coredns_private latest e9cdb7735889
[INFO][2023-06-23 05:41:22][cexproxy.cap.org] Saving docker image db-image14_private 8.12.1.31024 0390266e909c
[INFO][2023-06-23 05:41:25][cexproxy.cap.org] Saving docker image flannel_private latest 243c95edf4b0
[INFO][2023-06-23 05:41:26][cexproxy.cap.org] Saving docker image health_private latest 762744863022
[INFO][2023-06-23 05:41:27][cexproxy.cap.org] Saving docker image metrics-server_private latest b5d05b47245b
[INFO][2023-06-23 05:41:28][cexproxy.cap.org] Saving docker image network-health-monitor_private latest b6a651793b11
[INFO][2023-06-23 05:41:28][cexproxy.cap.org] Saving docker image nginx-httpd_private latest 06816bc1a534
[INFO][2023-06-23 05:41:29][cexproxy.cap.org] Saving docker image scripting-runtime_private latest c2de02927d4c
[INFO][2023-06-23 05:41:32][cexproxy.cap.org] Saving docker image squid-container_private 8.12.1.30661 d7cc0c97b8a8
[INFO][2023-06-23 05:41:34][cexproxy.cap.org] Saving docker image squid-container_private latest d7cc0c97b8a8
[INFO][2023-06-23 05:41:35][cexproxy.cap.org] Saving docker image traefik-ingress-controller_private 8.12.1.31024 0d300bea6394
[INFO][2023-06-23 05:41:36][cexproxy.cap.org] Saving docker image wavefront-proxy_private 8.12.1.31024 5cc0f36861be
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Docker images saved successully in local restore point.
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Saving restore point images catalog
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Restore point images catalog saved successfully.
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Saving restore point for /opt/vmware/prelude/metadata .
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Restore point for /opt/vmware/prelude/metadata saved successfully.
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Verifying source and destination checksums for restore point /opt/vmware/prelude/metadata
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Source and destination checksums for restore point /opt/vmware/prelude/metadata matched successfully.
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Local restore points saved successfully.
[INFO][2023-06-23 05:41:45][cexproxy.cap.org] Restore points on all nodes saved successfully.
[INFO][2023-06-23 05:41:46][cexproxy.cap.org] Shutting down application services
[INFO][2023-06-23 05:43:52][cexproxy.cap.org] Application services shut down successfully.
[INFO][2023-06-23 05:43:52][cexproxy.cap.org] Shutting down infrastructure services
[INFO][2023-06-23 05:46:05][cexproxy.cap.org] Infrastructure services shut down successfully.
[INFO][2023-06-23 05:46:06][cexproxy.cap.org] Saving restore points on all nodes.
[INFO][2023-06-23 05:46:06][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-save.sh 'local' sys-config sys-config at host: cexproxy.cap.org
[INFO][2023-06-23 05:46:21][cexproxy.cap.org] Saving local restore points.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:46:21][cexproxy.cap.org] Activating LCC maintenance mode.
[INFO][2023-06-23 05:46:22][cexproxy.cap.org] LCC maintenance mode activated successfully.
[INFO][2023-06-23 05:46:22][cexproxy.cap.org] Saving local restore point for Kubernetes object prelude-vaconfig.
[INFO][2023-06-23 05:46:22][cexproxy.cap.org] Local restore point for Kubernetes object prelude-vaconfig saved successfully.
[INFO][2023-06-23 05:46:22][cexproxy.cap.org] Local restore points saved successfully.
[INFO][2023-06-23 05:46:22][cexproxy.cap.org] Restore points on all nodes saved successfully.
[INFO][2023-06-23 05:46:23][cexproxy.cap.org] Saving restore points on all nodes.
[INFO][2023-06-23 05:46:23][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-save.sh 'local' live-data live-data at host: cexproxy.cap.org
[INFO][2023-06-23 05:46:38][cexproxy.cap.org] Saving local restore points.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:46:38][cexproxy.cap.org] Saving restore point for /data/db/live .
[INFO][2023-06-23 05:46:38][cexproxy.cap.org] Restore point for /data/db/live saved successfully.
[INFO][2023-06-23 05:46:38][cexproxy.cap.org] Verifying source and destination checksums for restore point /data/db/live
[INFO][2023-06-23 05:46:45][cexproxy.cap.org] Source and destination checksums for restore point /data/db/live matched successfully.
[INFO][2023-06-23 05:46:45][cexproxy.cap.org] Saving restore point for /data/vco .
[INFO][2023-06-23 05:46:46][cexproxy.cap.org] Restore point for /data/vco saved successfully.
[INFO][2023-06-23 05:46:46][cexproxy.cap.org] Verifying source and destination checksums for restore point /data/vco
[INFO][2023-06-23 05:47:03][cexproxy.cap.org] Source and destination checksums for restore point /data/vco matched successfully.
[INFO][2023-06-23 05:47:03][cexproxy.cap.org] Local restore points saved successfully.
[INFO][2023-06-23 05:47:03][cexproxy.cap.org] Restore points on all nodes saved successfully.
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Retrieving nodes status...
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Nodes status retrieval succeeded.
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Processing nodes data...
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Nodes data processing succeeded.
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Verifying nodes...
[INFO][2023-06-23 05:47:06][cexproxy.cap.org] Nodes verification completed successfully.
[INFO][2023-06-23 05:47:07][cexproxy.cap.org] Activating local monitors on all nodes.
[INFO][2023-06-23 05:47:07][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/mon-activate.sh at host: cexproxy.cap.org
[INFO][2023-06-23 05:47:17][cexproxy.cap.org] Activating upgrade monitor on the node
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 05/2023
[INFO][2023-06-23 05:47:17][cexproxy.cap.org] Upgrade monitor activated successfully on the node.
[INFO][2023-06-23 05:47:17][cexproxy.cap.org] Remote command succeeded: /opt/scripts/upgrade/mon-activate.sh at host: cexproxy.cap.org
[INFO][2023-06-23 05:47:17][cexproxy.cap.org] Local monitors successfully activated on all nodes.

  • As a next step, check the VAMI logs again to confirm the update installation has started
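A quick way to follow the VAMI activity is to tail its logs directly on the appliance. This is a sketch: the updatecli.log path is taken from the log excerpt later in this runbook, while vami.log is the usual VAMI agent log location on VMware appliances and may vary by version.

```shell
# Follow the VAMI update agent log on the cloud extensibility proxy
tail -f /opt/vmware/var/log/vami/vami.log

# updatecli output is redirected here during package installation
# (path as seen in the updatecli command line later in this runbook)
tail -f /opt/vmware/var/log/vami/updatecli.log
```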


23/06/2023 05:49:12 [INFO] Installing updates. instanceid=VMware:VMware_8.12.2.31329, jobid=2
23/06/2023 05:49:12 [INFO] Using update repository found on CDROM device: /dev/sr0
23/06/2023 05:49:12 [INFO] Installing update. instanceId=VMware:VMware_8.12.2.31329, jobId=2, url=file:///tmp/update_agent_cdrom_PPEywF/update, username=, password=
23/06/2023 05:49:12 [INFO] Downloading and installing update packages
23/06/2023 05:49:12 [INFO] Signature script output: Verified OK
23/06/2023 05:49:12 [INFO] Signature script output:
23/06/2023 05:49:12 [INFO] Manifest signature verification passed
23/06/2023 05:49:12 [INFO] Creating /opt/vmware/var/lib/vami/update/data/update_progress.json
23/06/2023 05:49:12 [INFO] Downloading the following packages for update version 8.12.2.31329
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: audit  x86_64  (none)  2.8.5  25.ph3  /package-pool/audit-2.8.5-25.ph3.x86_64.rpm  rpm  452607  2fdf95616439cdd2064887aefe2fe26d1cc0dd6e
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: bootstrap-extensibility  noarch  (none)  8.12.2.31329  1  /package-pool/bootstrap-extensibility-8.12.2.31329-1.noarch.rpm  rpm  361379  be75189afedd82a3dbab10c491dee5b7971ef481
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: crd-tools  noarch  (none)  8.12.2.31329  1  /package-pool/crd-tools-8.12.2.31329-1.noarch.rpm  rpm  39246  8134056bbe1966a486b33bd5e1e09cb3c215159b
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: curl  x86_64  (none)  8.1.1  1.ph3  /package-pool/curl-8.1.1-1.ph3.x86_64.rpm  rpm  166073  e0c4eb1c445143f439c3e7d9ce69a629645f4881
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: curl-libs  x86_64  (none)  8.1.1  1.ph3  /package-pool/curl-libs-8.1.1-1.ph3.x86_64.rpm  rpm  340471  6427ba93d5b01517cfeb8d6a8b5b2b08f06fc52d
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: libuv  x86_64  (none)  1.34.2  3.ph3  /package-pool/libuv-1.34.2-3.ph3.x86_64.rpm  rpm  98392  76e2bc02b3270f9dc5ee79a68efc8393b1f91963
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: linux  x86_64  (none)  4.19.283  2.ph3  /package-pool/linux-4.19.283-2.ph3.x86_64.rpm  rpm  23920750  2f3e65df8b679f3c73d8d03ce606d5e43861c420
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: linux-hmacgen  x86_64  (none)  4.19.283  2.ph3  /package-pool/linux-hmacgen-4.19.283-2.ph3.x86_64.rpm  rpm  49845  a3d2880a685c75c6cf9e40dccdc4b417cade0a56
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: nss  x86_64  (none)  3.44  10.ph3  /package-pool/nss-3.44-10.ph3.x86_64.rpm  rpm  938828  6ff75c42717feba40d922a3a9f8ddeabc7698915
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: nss-libs  x86_64  (none)  3.44  10.ph3  /package-pool/nss-libs-3.44-10.ph3.x86_64.rpm  rpm  930098  b31032c37de21175e820119bbee383d012326f1a
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: open-vm-tools  x86_64  (none)  12.2.0  2.ph3  /package-pool/open-vm-tools-12.2.0-2.ph3.x86_64.rpm  rpm  1195114  941a315c53f5ac5ffcc9ff8bef061c18303ac629
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: openssl  x86_64  (none)  1.0.2zh  1.ph3  /package-pool/openssl-1.0.2zh-1.ph3.x86_64.rpm  rpm  2192646  1b442868bbc53bf4410deb7155e13d6ae2291b46
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: openssl-c_rehash  x86_64  (none)  1.0.2zh  1.ph3  /package-pool/openssl-c_rehash-1.0.2zh-1.ph3.x86_64.rpm  rpm  13878  844d112946530181b5f2fa36dc0b5e56eb21a3ac
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-deploy-base  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-deploy-base-8.12.2.31329-1.noarch.rpm  rpm  47756  3d2480f04422d7b627f85b6fe9930a68d890b6b5
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-deploy-common  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-deploy-common-8.12.2.31329-1.noarch.rpm  rpm  7530916  14b37cdb99f903c4e547ee4f3913af89d587fcae
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-etcd  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-etcd-8.12.2.31329-1.noarch.rpm  rpm  8008981  3bf0fa797c739db3d84924770ed1a9154925cc56
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-flannel  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-flannel-8.12.2.31329-1.noarch.rpm  rpm  3751  ecf07a7dc3d753bb294d27fc9dc48de903baa30b
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-health  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-health-8.12.2.31329-1.noarch.rpm  rpm  14045  ec9b85e2ce01832ed8b4148503b7b59fde1dec86
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-k8s-base  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-k8s-base-8.12.2.31329-1.noarch.rpm  rpm  46022708  53c2cc91d3bc7fe6e7a39e470a0bf3d556cd6ad2
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-k8s-config  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-k8s-config-8.12.2.31329-1.noarch.rpm  rpm  6990  c91ce8c42eea084f80b0fdcae4e455bfb39c483f
23/06/2023 05:49:12 [INFO] package UPDATE VERSION: prelude-k8s-runtime  noarch  (none)  8.12.2.31329  1  /package-pool/prelude-k8s-runtime-8.12.2.31329-1.noarch.rpm  rpm  11974880  9895104faa98e07ee4d0843d01d8b65375881f17
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-00415ba83db6c6520545c018d970a961f49e4678bd4f1f2c7ee52b164fb8e796  noarch  (none)  1  1  /package-pool/prelude-layer-00415ba83db6c6520545c018d970a961f49e4678bd4f1f2c7ee52b164fb8e796-1-1.noarch.rpm  rpm  5843224  840e78096de57d6e4a95db09a40b10a8df6b7bd1
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-0c56a2b135ef8e6f67d57ab24ab06dd400564839e4fe41e2785d7084c2f3a777  noarch  (none)  1  1  /package-pool/prelude-layer-0c56a2b135ef8e6f67d57ab24ab06dd400564839e4fe41e2785d7084c2f3a777-1-1.noarch.rpm  rpm  12696780  a96670c157192288e3e1c54d8a3bc2593a2f595e
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-1c1d194ce1da0096def3c9fff6f308fa5e36c4a4ed8b3722ec70adf9680770ec  noarch  (none)  1  1  /package-pool/prelude-layer-1c1d194ce1da0096def3c9fff6f308fa5e36c4a4ed8b3722ec70adf9680770ec-1-1.noarch.rpm  rpm  77327172  3437fbb3b087cc6c5a2a65a1aef8ca59b979f399
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-1fd0ebdf6ea14ee709222ade830f1646b260c810b96e74d167af479fd3ca531d  noarch  (none)  1  1  /package-pool/prelude-layer-1fd0ebdf6ea14ee709222ade830f1646b260c810b96e74d167af479fd3ca531d-1-1.noarch.rpm  rpm  92048  2569c80f9ff105168d0f964c259e6192dbb1746f
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-2b4e3dcc6c3b62909cb5a005d4d1dbb5ad9d198687a9a7d17724b6810e32239f  noarch  (none)  1  1  /package-pool/prelude-layer-2b4e3dcc6c3b62909cb5a005d4d1dbb5ad9d198687a9a7d17724b6810e32239f-1-1.noarch.rpm  rpm  37640636  114fd8a32e0f70258fcfffed80141b13f33a3d75
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-3e5b28fa5e2d2fe28a2cc09f24bbc0d71f8da23509a910eb926b9476ce0ed8d3  noarch  (none)  1  1  /package-pool/prelude-layer-3e5b28fa5e2d2fe28a2cc09f24bbc0d71f8da23509a910eb926b9476ce0ed8d3-1-1.noarch.rpm  rpm  21827248  688febc2b0691db5f402ef2e6720b67ce476dc0f
23/06/2023 05:49:12 [INFO] package NEW PACKAGE   : prelude-layer-435ddda7eb71d8c7220891fc92a06bbad44c87b6f3c54d14ee1e4f61b2f5b95c  noarch  (none)  1  1  /package-pool/prelude-layer-435ddda7eb71d8c7220891fc92a06bbad44c87b6f3c54d14ee1e4f61b2f5b95c-1-1.noarch.rpm  rpm  3444  cf1639bd88da2fe748f313035c80547a6e0a7cce
*
**
*
*


23/06/2023 05:49:52 [INFO] Reboot required when installing linux package
23/06/2023 05:49:52 [INFO] Created reboot required file
23/06/2023 05:49:52 [INFO] Update 8.12.2.31329 manifest is set to be installed
23/06/2023 05:49:52 [INFO] Using update post-install script
23/06/2023 05:49:52 [INFO] Running updatecli to install updates. command={ mkdir -p /usr/share/update-notifier ; ln -s /opt/vmware/share/vami/vami_notify_reboot_required /usr/share/update-notifier/notify-reboot-required ; /opt/vmware/share/vami/update/updatecli '/opt/vmware/var/lib/vami/update/data/job/2' '8.12.1.31024' '8.12.2.31329' ; mv /opt/vmware/var/lib/vami/update/data/update_progress.json /opt/vmware/var/lib/vami/update/data/.update_progress.json ; } >> /opt/vmware/var/log/vami/updatecli.log 2>&1 &
23/06/2023 05:49:53 [INFO] Installation running in the background
23/06/2023 05:56:40 [INFO] Reloading new configuration. file=/opt/vmware/var/lib/vami/update/provider/provider-deploy.xml
23/06/2023 05:57:47 [INFO] Moving next manifest to installed manifest

  • Now the actual upgrade has been triggered, and the focus shifts to updatecli.log and the timestamped upgrade-datetime.log
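To watch both logs side by side, something like the following works; note that the directory for the timestamped upgrade log is an assumption based on vRA-family appliances, so adjust the glob to whatever `ls` shows on your proxy.

```shell
# Package installation progress from updatecli
tail -f /opt/vmware/var/log/vami/updatecli.log

# The upgrade orchestration log is timestamped; follow the newest one
# (directory and filename pattern assumed; verify on your appliance)
ls -t /var/log/vmware/prelude/upgrade-*.log | head -1 | xargs tail -f
```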


23/06/2023 05:49:52 [INFO] Starting Install
23/06/2023 05:49:52 [INFO] Update status: Starting Install
23/06/2023 05:49:52 [INFO] Update status: Running pre-install scripts
23/06/2023 05:49:52 [INFO] Running /opt/vmware/var/lib/vami/update/data/job/2/pre_install '8.12.1.31024' '8.12.2.31329'
JOBPATH /opt/vmware/var/lib/vami/update/data/job/2
total 168
drwxr-xr-x 2 root root   4096 Jun 23 05:49 .
drwxr-xr-x 3 root root   4096 Jun 23 05:49 ..
-rwx------ 1 root root    125 Jun 23 05:49 manifest_update
--w--w-r-- 1 root root 120169 Jun 23 05:49 manifest.xml
-rwx------ 1 root root    833 Jun 23 05:49 post_install
-rwx------ 1 root root   2759 Jun 23 05:49 pre_install
-rwx------ 1 root root  10606 Jun 23 05:49 run_command
-rw-r--r-- 1 root root     57 Jun 23 05:49 status
-rwx------ 1 root root  10613 Jun 23 05:49 test_command
+ preupdate=/var/vmware/prelude/upgrade/bootstrap/preupdate.sh
**
*


dracut: *** Installing kernel module dependencies ***
dracut: *** Installing kernel module dependencies done ***
dracut: *** Resolving executable dependencies ***
dracut: *** Resolving executable dependencies done***
dracut: *** Generating early-microcode cpio image ***
dracut: *** Store current command line parameters ***
dracut: Stored kernel commandline:
dracut: root=UUID=6b8a710d-390f-483f-9d62-db66a2ce6429 rootfstype=ext4 rootflags=rw,relatime
dracut: *** Creating image file '/boot/initrd.img-4.19.283-2.ph3' ***
dracut: *** Creating initramfs image file '/boot/initrd.img-4.19.283-2.ph3' done ***
23/06/2023 05:50:49 [INFO] Update status: Done package installation
23/06/2023 05:50:49 [INFO] Update status: Running post-install scripts
23/06/2023 05:50:49 [INFO] Running /opt/vmware/var/lib/vami/update/data/job/2/post_install '8.12.1.31024' '8.12.2.31329' 0
2023-06-23 05:50:49Z Main bootstrap postupdate started*
*
*
*
2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-00-restore-ovfenv.sh starting...
2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-00-restore-ovfenv.sh succeeded


=========================


2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-aa-set-fips-cexp.sh starting...
2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-aa-set-fips-cexp.sh succeeded


=========================


2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-clear-services.sh starting...
2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-clear-services.sh succeeded


=========================


2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-configure-dns.sh starting...
2023-06-23 05:50:49Z /etc/bootstrap/postupdate.d/00-configure-dns.sh succeeded*
*
*
***
+ chmod a+x /etc/bootstrap/everyboot.d/zz-zz-resume-upgrade.sh
+ export -f vami_reboot_background
+ echo 'Scheduling a post-upgrade reboot...'
Scheduling a post-upgrade reboot...
+ echo 'Post-upgrade reboot scheduled'
Post-upgrade reboot scheduled
+ nohup bash -c vami_reboot_background
+ exit 0
2023-06-23 05:56:39Z /etc/bootstrap/postupdate.d/99-10-handover-prelude succeeded


=========================


2023-06-23 05:56:39Z /etc/bootstrap/postupdate.d/README is not executable.
2023-06-23 05:56:39Z Main bootstrap postupdate done
23/06/2023 05:56:39 [INFO] Update status: Done post-install scripts
23/06/2023 05:56:39 [INFO] Update status: Running VMware tools reconfiguration
23/06/2023 05:56:39 [INFO] Running /opt/vmware/share/vami/vami_reconfigure_tools
vmware-toolbox-cmd is /bin/vmware-toolbox-cmd
vmtoolsd wrapper not required on this VM with systemd.
23/06/2023 05:56:39 [INFO] Update status: Done VMware tools reconfiguration
23/06/2023 05:56:39 [INFO] Update status: Running finalizing installation
23/06/2023 05:56:39 [INFO] Running /opt/vmware/var/lib/vami/update/data/job/2/manifest_update
23/06/2023 05:56:39 [INFO] Update status: Done finalizing installation
23/06/2023 05:56:39 [INFO] Update status: Update completed successfully
23/06/2023 05:56:39 [INFO] Install Finished


[INFO][2023-06-23 05:49:11][cexproxy.cap.org] Starting installation of updates.
[INFO][2023-06-23 05:49:11][cexproxy.cap.org] Update installation started successfully.
[INFO][2023-06-23 05:50:10][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 05:50:10][cexproxy.cap.org] VAMI upgrade in progress.
[INFO][2023-06-23 05:50:10][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 05:51:10][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 05:51:10][cexproxy.cap.org] VAMI upgrade in progress.
[INFO][2023-06-23 05:51:10][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 05:52:11][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 05:52:11][cexproxy.cap.org] VAMI upgrade in progress.
[INFO][2023-06-23 05:52:11][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 05:53:10][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 05:53:10][cexproxy.cap.org] VAMI upgrade in progress.
*
*
[INFO][2023-06-23 05:56:11][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 05:56:39][cexproxy.cap.org] Waiting for VAMI to exit ...
[INFO][2023-06-23 05:57:09][cexproxy.cap.org] Verifying VAMI overall upgrade result ...*
*
*
[INFO][2023-06-23 05:59:10][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 06:00:10][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 06:00:10][cexproxy.cap.org] VAMI upgrade in progress.
[INFO][2023-06-23 06:00:10][cexproxy.cap.org] VAMI upgrade still in progress.
[INFO][2023-06-23 06:01:10][cexproxy.cap.org] VAMI checking upgrade progress
[INFO][2023-06-23 06:01:10][cexproxy.cap.org] VAMI upgrade completed succesfully.
[INFO][2023-06-23 06:01:10][cexproxy.cap.org] VAMI upgrade completed successfully.

  • Once the VAMI upgrade completes, the node reboots and the services deployment is triggered


[INFO][2023-06-23 06:02:10][cexproxy.cap.org] Saving artifacts metadata
[INFO][2023-06-23 06:02:11][cexproxy.cap.org] Artifacts metadata saved.
[INFO][2023-06-23 06:03:10][cexproxy.cap.org] Resolving post-upgrade controller.
[INFO][2023-06-23 06:03:10][cexproxy.cap.org] This node is elected for post-upgrade controller.
[INFO][2023-06-23 06:03:12][cexproxy.cap.org] Resolving post-upgrade nodes quorum...
[INFO][2023-06-23 06:03:13][cexproxy.cap.org] Adding nodes to the cluster.
[INFO][2023-06-23 06:03:13][cexproxy.cap.org] Cluster nodes added successfully.
[INFO][2023-06-23 06:03:15][cexproxy.cap.org] Restoring all restore points.
[INFO][2023-06-23 06:03:16][cexproxy.cap.org] Restoring Kubernetes object from local restore point: /data/restorepoint/sys-config/vaconfig
[INFO][2023-06-23 06:03:16][cexproxy.cap.org] Restoration of Kubernetes object from local restore point /data/restorepoint/sys-config/vaconfig completed successfully
[INFO][2023-06-23 06:03:16][cexproxy.cap.org] Triggering vaconfig schema update...
[INFO][2023-06-23 06:03:17][cexproxy.cap.org] Vaconfig schema update completed
[INFO][2023-06-23 06:03:16][cexproxy.cap.org] Looking for pending changes for vaconfig
2023-06-23 06:03:16,938 vra_crd.schema_migration.k8s_obj_rev_manager [DEBUG] Listing pending changes...
**
*
*


2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_rev_manager [DEBUG] Looking into change candidate /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.8.2.100-introduce-fips-mode-property-in-vaconfig.sh...
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_change [DEBUG] Computed revision for change /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.8.2.100-introduce-fips-mode-property-in-vaconfig.sh: 8.8.2.100
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_rev [DEBUG] Compared 8.8.2.100 to 8.12.0.100: -1
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_rev_manager [DEBUG] Looking into change candidate /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.10.2.220-register_liagent_action.py...
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_change [DEBUG] Computed revision for change /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.10.2.220-register_liagent_action.py: 8.10.2.220
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_rev [DEBUG] Compared 8.10.2.220 to 8.12.0.100: -1
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_rev_manager [DEBUG] Looking into change candidate /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.12.0.100-migrate_capabilities_data_model.py...
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_change [DEBUG] Computed revision for change /etc/vmware-prelude/crd-schema-changelogs/vaconfig/8.12.0.100-migrate_capabilities_data_model.py: 8.12.0.100
2023-06-23 06:03:16,978 vra_crd.schema_migration.k8s_obj_schema_rev [DEBUG] Compared 8.12.0.100 to 8.12.0.100: 0
2023-06-23 06:03:16,979 vra_crd.schema_migration.k8s_obj_rev_manager [DEBUG] Computed list of pending changes:
[INFO][2023-06-23 06:03:17][cexproxy.cap.org] Vaconfig is up to date
[INFO][2023-06-23 06:03:17][cexproxy.cap.org] Deactivating LCC maintenance mode.
[INFO][2023-06-23 06:03:17][cexproxy.cap.org] LCC maintenance mode deactivated successfully.
[INFO][2023-06-23 06:03:17][cexproxy.cap.org] Restoration from local restore points completed successfully.
[INFO][2023-06-23 06:03:18][cexproxy.cap.org] Deployment of infrastructure and application services started.
[INFO][2023-06-23 06:04:10][cexproxy.cap.org] Deactivating upgrade monitor on the node
[INFO][2023-06-23 06:04:10][cexproxy.cap.org] Upgrade monitor deactivated successfully on the node.
[INFO][2023-06-23 06:12:44][cexproxy.cap.org] Infrastructure and application services deployed successfully.
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Retrieving services status...
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Services status retrieval succeeded.
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Processing services data...
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Services data processing succeeded.
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Verifying services...
[INFO][2023-06-23 06:12:45][cexproxy.cap.org] Services verification completed successfully.
[INFO][2023-06-23 06:12:46][cexproxy.cap.org] Cleaning up restore point on all nodes.
[INFO][2023-06-23 06:12:46][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-clean.sh sys-config at host: cexproxy.cap.org
[INFO][2023-06-23 06:13:01][cexproxy.cap.org] Cleaning up restore point /data/restorepoint/sys-config .
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 06/2023
[INFO][2023-06-23 06:13:01][cexproxy.cap.org] Restore point /data/restorepoint/sys-config cleaned up successfully.
[INFO][2023-06-23 06:13:01][cexproxy.cap.org] Restore point cleaned up on all nodes.
[INFO][2023-06-23 06:13:01][cexproxy.cap.org] Cleaning up restore point on all nodes.
[INFO][2023-06-23 06:13:01][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-clean.sh artifacts-lastworking at host: cexproxy.cap.org
[INFO][2023-06-23 06:13:15][cexproxy.cap.org] Cleaning up restore point /data/restorepoint/artifacts-lastworking .
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 06/2023
[INFO][2023-06-23 06:13:15][cexproxy.cap.org] Restore point /data/restorepoint/artifacts-lastworking cleaned up successfully.
[INFO][2023-06-23 06:13:15][cexproxy.cap.org] Restore point cleaned up on all nodes.
[INFO][2023-06-23 06:13:15][cexproxy.cap.org] Cleaning up restore point on all nodes.
[INFO][2023-06-23 06:13:15][cexproxy.cap.org] Running remote command: /opt/scripts/upgrade/rstp-clean.sh live-data at host: cexproxy.cap.org
[INFO][2023-06-23 06:13:30][cexproxy.cap.org] Cleaning up restore point /data/restorepoint/live-data .
Pseudo-terminal will not be allocated because stdin is not a terminal.
Welcome to Aria Automation Extensibility Appliance 06/2023
[INFO][2023-06-23 06:13:32][cexproxy.cap.org] Restore point /data/restorepoint/live-data cleaned up successfully.
[INFO][2023-06-23 06:13:32][cexproxy.cap.org] Restore point cleaned up on all nodes.
[INFO][2023-06-23 06:13:42][cexproxy.cap.org] Reverting SSH configuration on nodes.
[INFO][2023-06-23 06:13:43][cexproxy.cap.org] Stopping sshd service..
[INFO][2023-06-23 06:13:44][cexproxy.cap.org] Service sshd stopped.
[INFO][2023-06-23 06:13:44][cexproxy.cap.org] Starting sshd service..
[INFO][2023-06-23 06:13:44][cexproxy.cap.org] Service sshd started.
[INFO][2023-06-23 06:13:44][cexproxy.cap.org] SSH configuration reverted on nodes successfully.
[INFO][2023-06-23 06:14:01][cexproxy.cap.org] Cleaning up upgrade runtime state.
[INFO][2023-06-23 06:14:11][cexproxy.cap.org] Archiving upgrade runtime data.
[INFO][2023-06-23 06:14:13][cexproxy.cap.org] Upgrade runtime data archived successfully.
[INFO][2023-06-23 06:14:13][cexproxy.cap.org] Clearing upgrade runtime directory.
[INFO][2023-06-23 06:14:13][cexproxy.cap.org] Upgrade runtime directory cleared successfully.
[INFO][2023-06-23 06:14:14][cexproxy.cap.org] Upgrade runtime clean up completed.

  • Check pods status
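The pod check can be done from an SSH session on the appliance. The `prelude` namespace is what vRA-family appliances use for their application services; this is an assumption for the extensibility proxy, so fall back to listing all namespaces if it differs.

```shell
# Confirm everything is Running / Completed after the upgrade
kubectl get pods --all-namespaces

# Watch the application services namespace until the pods settle
# (namespace name assumed from vRA-family appliances)
kubectl -n prelude get pods -w
```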



  • Upgrade Status
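On vRA-family appliances the `vracli` utility reports the upgrade outcome; assuming the same CLI is present on the extensibility proxy, the status can be verified with:

```shell
# Report the result of the last upgrade run
# (vracli subcommand as available on vRA-family appliances)
vracli upgrade status
```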



  • This concludes the upgrade procedure for the cloud extensibility proxy


 

