
Re-Trust vRA with vIDM from vRSLCM | Deep-Dive

Overview

At a high level, these are the steps that occur when you perform Re-Trust with Identity Manager from vRSLCM:

Start
Start VMware Identity Manager flow
Check if VMware Identity Manager root certificate is present on vRealize Automation
Check for VMware Identity Manager Instance availability
Check for VMware Identity Manager login token
VMware Identity Manager health check for vRealize Automation
Check for VMware Identity Manager configuration user availability
Configure VMware Identity Manager for vRealize Automation
Initialize vRealize Automation
Update vRealize Automation certificate in vRealize Lifecycle Manager Inventory
Update VMware Identity Manager allowed redirects
Final

Disclaimer: Hostnames used are examples from my lab and do not represent any company or organization.

Deep-Dive

Let's deep-dive into the actions performed when you Re-Trust vRA with Identity Manager from vRSLCM.

1. When a request is submitted in vRSLCM, a request to the request service is created:

2022-07-28 00:27:37.142 INFO [http-nio-8080-exec-5] c.v.v.l.l.u.EnvironmentValidationHelper - -- Product with ID : vraproductFound : true
2022-07-28 00:27:37.211 INFO [http-nio-8080-exec-5] c.v.v.l.l.u.RequestSubmissionUtil - -- ++++++++++++++++++ Creating request to Request_Service :::>>> { "vmid" : "eb90d4cc-1c58-4a6c-a7f5-8f3dea65486c", "transactionId" : null, "tenant" : "default", "requestName" : "vidmproductretrust", "requestReason" : "VRA in Environment Production - Re-trust Product with Identity Manager", "requestType" : "PRODUCT_VIDM_RETRUST", "requestSource" : "cf8ac4ce-a7a7-4958-8401-50efdf4f1489", "requestSourceType" : "user", "inputMap" : { "environmentId" : "cf8ac4ce-a7a7-4958-8401-50efdf4f1489", "productId" : "vra" }, "outputMap" : { }, "state" : "CREATED", "executionId" : null, "executionPath" : null, "executionStatus" : null, "errorCause" : null, "resultSet" : null, "isCancelEnabled" : null, "lastUpdatedOn" : 1658968057210, "createdBy" : null }

2.
Request ID response:

2022-07-28 00:27:37.229 INFO [http-nio-8080-exec-5] c.v.v.l.l.u.RequestSubmissionUtil - -- Generic Request Response : { "requestId" : "eb90d4cc-1c58-4a6c-a7f5-8f3dea65486c" }

3. Identifies whether it is a clustered or a standalone vIDM, and fetches the hostname and root password for vIDM:

2022-07-28 00:27:37.798 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- Number of request to be processed : 1
2022-07-28 00:27:37.855 INFO [scheduling-1] c.v.v.l.r.c.p.ProductReregisterNRetrustPlanner - -- Found product with id vidm
2022-07-28 00:27:37.865 INFO [scheduling-1] c.v.v.l.r.c.p.CreateEnvironmentPlanner - -- Not a clustered vIDM, fetching the hostname from primary node.
2022-07-28 00:27:37.873 INFO [scheduling-1] c.v.v.l.r.c.p.CreateEnvironmentPlanner - -- Not a clustered vIDM, fetching from primary node.
2022-07-28 00:27:37.875 INFO [scheduling-1] c.v.v.l.r.c.p.CreateEnvironmentPlanner - -- Base tenant id: idm
2022-07-28 00:27:37.876 INFO [scheduling-1] c.v.v.l.r.c.p.CreateEnvironmentPlanner - -- Fetching the hostname and root password YXYXYXYX primary node.
2022-07-28 00:27:37.883 INFO [scheduling-1] c.v.v.l.r.c.p.ProductReregisterNRetrustPlanner - -- Found product with id vra and version above 8.0.0
2022-07-28 00:27:37.904 INFO [scheduling-1] c.v.v.l.r.c.p.ProductReregisterNRetrustPlanner - -- lbTermination value passed YXYXYXYX is :: false

4.
Product Re-register and Re-trust Planner SPEC is logged 2022-07-28 00:27:37.906 INFO [scheduling-1] c.v.v.l.r.c.p.ProductReregisterNRetrustPlanner - -- Product Re-register and Re-trust Planner SPEC :: { "vmid" : "e98ba0cb-7a17-4ed0-ae45-93255220cdf7", "tenant" : "default", "originalRequest" : null, "enhancedRequest" : null, "symbolicName" : null, "acceptEula" : false, "variables" : { }, "products" : [ { "symbolicName" : "vravaretrustvidm", "displayName" : null, "productVersion" : null, "priority" : 0, "dependsOn" : [ ], "components" : [ { "component" : { "symbolicName" : "vravaretrustvidm", "type" : null, "componentVersion" : null, "properties" : { "cafeHostNamePrimary" : "vra.cap.org", "cafeRootPasswordPrimary" : "JXJXJXJX", "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "baseTenantId" : "idm", "uberAdminUserType" : "LOCAL", "version" : "8.8.1", "masterVidmAdminPassword" : "JXJXJXJX", "uberAdmin" : "configadmin", "masterVidmEnabled" : "true", "__version" : "8.8.1", "uberAdminPassword" : "JXJXJXJX", "masterVidmHostName" : "idm.cap.org", "masterVidmAdminUserName" : "admin", "isLBSslTerminated" : "false", "authProviderHostnames" : "idm.cap.org", "vidmPrimaryNodeHostname" : "idm.cap.org" } }, "priority" : 0 } ] } ] } 5. 
Engine request is triggered 2022-07-28 00:27:37.910 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- ENGINE REQUEST :: { "vmid" : "e98ba0cb-7a17-4ed0-ae45-93255220cdf7", "tenant" : "default", "originalRequest" : null, "enhancedRequest" : null, "symbolicName" : null, "acceptEula" : false, "variables" : { }, "products" : [ { "symbolicName" : "vravaretrustvidm", "displayName" : null, "productVersion" : null, "priority" : 0, "dependsOn" : [ ], "components" : [ { "component" : { "symbolicName" : "vravaretrustvidm", "type" : null, "componentVersion" : null, "properties" : { "cafeHostNamePrimary" : "vra.cap.org", "cafeRootPasswordPrimary" : "JXJXJXJX", "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "baseTenantId" : "idm", "uberAdminUserType" : "LOCAL", "version" : "8.8.1", "masterVidmAdminPassword" : "JXJXJXJX", "uberAdmin" : "configadmin", "masterVidmEnabled" : "true", "__version" : "8.8.1", "uberAdminPassword" : "JXJXJXJX", "masterVidmHostName" : "idm.cap.org", "masterVidmAdminUserName" : "admin", "isLBSslTerminated" : "false", "authProviderHostnames" : "idm.cap.org", "vidmPrimaryNodeHostname" : "idm.cap.org" } }, "priority" : 0 } ] } ] } 6. 
Engine request is processed where a suite creation request is successful 2022-07-28 00:27:37.914 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vravaretrustvidm 2022-07-28 00:27:37.920 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vravaretrustvidm 2022-07-28 00:27:37.920 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2022-07-28 00:27:37.948 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:37.949 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2022-07-28 00:27:37.958 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2022-07-28 00:27:37.958 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:37.959 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:37.959 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:38.005 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vravaretrustvidm', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vravaretrustvidm.vmfx'} 2022-07-28 00:27:38.022 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- ENGINE REQUEST STATUS :: { "vmid" : "14adc2bc-39a5-47f1-b139-144a1eab4936", "transactionId" : null, "tenant" : "default", "message" : "Suite Creation Request is Successful", "identifier" : "bcbf42fe-3a24-4109-8e9e-a573b9e67d5b", "type" : "SUCCESS", "executionPath" : "{\"vmid\":\"e98ba0cb-7a17-4ed0-ae45-93255220cdf7\",\"tenant\":\"default\", * * 
{\"name\":\"OnVidmUpdateAllowRedirectSuccess\",\"source\":\"com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask\",\"destination\":\"com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask\",\"type\":\"SM_EVENT\",\"properties\":{},\"uiProperties\":{\"displayText\":\"Updating VMware Identity Manager Allow redirect success\",\"displayKey\":\"vmf::sm::vravaretrustvidm::edge::name::OnVidmUpdateAllowRedirectSuccess::source::c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask\"}}]}}]}" } 7. Request set to IN PROGRESS 2022-07-28 00:27:38.033 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- REQUEST TO BE UPDATED :: { "vmid" : "eb90d4cc-1c58-4a6c-a7f5-8f3dea65486c", "transactionId" : null, "tenant" : "default", "requestName" : "vidmproductretrust", "requestReason" : "VRA in Environment Production - Re-trust Product with Identity Manager", "requestType" : "PRODUCT_VIDM_RETRUST", "requestSource" : "cf8ac4ce-a7a7-4958-8401-50efdf4f1489", "requestSourceType" : "user", "inputMap" : { "environmentId" : "cf8ac4ce-a7a7-4958-8401-50efdf4f1489", "productId" : "vra" }, "outputMap" : { }, "state" : "INPROGRESS", "executionId" : "bcbf42fe-3a24-4109-8e9e-a573b9e67d5b", "executionPath" : "{\"vmid\":\"e98ba0cb-7a17-4ed0-ae45-93255220cdf7\",\"tenant\":\"default\",\"symbolicName\":null,\"acceptEula\":false,\"variables\":{},\"products\":[{\"symbolicName\":\"vravaretrustvidm\",\"symbolicNameTxt\":null,\"productVersion\":null,\"priority\":0,\"dependsOn\":[],\"components\":[{\"component\":{\"symbolicName\":\"vravaretrustvidm\",\"type\":null,\"componentVersion\":null}}],\"vmf\":{ * * {\"name\":\"OnVidmUpdateAllowRedirectSuccess\",\"source\":\"com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask\",\"destination\":\"com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask\",\"type\":\"SM_EVENT\",\"properties\":{},\"uiProperties\":{\"displayText\":\"Updating VMware Identity Manager Allow redirect 
success\",\"displayKey\":\"vmf::sm::vravaretrustvidm::edge::name::OnVidmUpdateAllowRedirectSuccess::source::c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask\"}}]}}]}", "executionStatus" : null, "errorCause" : null, "resultSet" : null, "isCancelEnabled" : null, "lastUpdatedOn" : 1658968057228, "createdBy" : "admin@local" } 2022-07-28 00:27:38.038 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- Processing request with ID : eb90d4cc-1c58-4a6c-a7f5-8f3dea65486c with request type PRODUCT_VIDM_RETRUST with request state INPROGRESS. 8. Queing and Saving the request 2022-07-28 00:27:38.322 INFO [scheduling-1] c.v.v.l.a.c.UserRequestProcessor - -- QUEING NEW USER REQUEST :: { "vmid" : "bcbf42fe-3a24-4109-8e9e-a573b9e67d5b", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658968058018, "lastUpdatedOn" : 1658968058018, "version" : "8.1.0.0", "vrn" : null, "status" : "CREATED", "originalRequest" : null, "enhancedRequest" : null, "parameter" : "{\"vmid\":\"e98ba0cb-7a17-4ed0-ae45-93255220cdf7\",\"tenant\":\"default\",\"originalRequest\":null,\"enhancedRequest\":null, * * :{\"displayText\":\"Updating VMware Identity Manager Allow redirect success\",\"displayKey\":\"vmf::sm::vravaretrustvidm::edge::name::OnVidmUpdateAllowRedirectSuccess::source::c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask\"}}]}}]}", "information" : "", "lastExexutedEvent" : "", "type" : "SUITE", "lastProcessedPriority" : -1, "userRequestLock" : 0 2022-07-28 00:27:38.335 INFO [scheduling-1] c.v.v.l.a.g.s.UserRequestServiceImpl - -- Saving user request 9. 
Product Specification is logged 2022-07-28 00:27:38.346 INFO [scheduling-1] c.v.v.l.a.c.UserRequestProcessor - -- Product Specification :: { "symbolicName" : "vravaretrustvidm", "displayName" : null, "productVersion" : null, "priority" : 0, "dependsOn" : [ ], "components" : [ { "component" : { "symbolicName" : "vravaretrustvidm", "type" : null, "componentVersion" : null, "properties" : { "cafeHostNamePrimary" : "vra.cap.org", "cafeRootPasswordPrimary" : "JXJXJXJX", "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "baseTenantId" : "idm", "uberAdminUserType" : "LOCAL", "version" : "8.8.1", "masterVidmAdminPassword" : "JXJXJXJX", "uberAdmin" : "configadmin", "masterVidmEnabled" : "true", "__version" : "8.8.1", "uberAdminPassword" : "JXJXJXJX", "masterVidmHostName" : "idm.cap.org", "masterVidmAdminUserName" : "admin", "isLBSslTerminated" : "false", "authProviderHostnames" : "idm.cap.org", "vidmPrimaryNodeHostname" : "idm.cap.org" } }, "priority" : 0 } ] } 2022-07-28 00:27:38.347 INFO [scheduling-1] c.v.v.l.a.c.UserRequestProcessor - -- GETTING SPEC FOR (productSymbolicName) :: vravaretrustvidm 2022-07-28 00:27:38.347 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vravaretrustvidm 2022-07-28 00:27:38.347 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vravaretrustvidm 2022-07-28 00:27:38.348 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2022-07-28 00:27:38.370 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:38.371 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2022-07-28 00:27:38.372 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2022-07-28 00:27:38.372 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: 
/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:38.372 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:38.372 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:38.374 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vravaretrustvidm', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vravaretrustvidm.vmfx'} 2022-07-28 00:27:38.375 INFO [scheduling-1] c.v.v.l.a.c.UserRequestProcessor - -- MACHINE :: {"symbolicName":"vravaretrustvidm","type":"STATEMACHINE","version":"1.0.0","startState":"com.vmware.vrealize.lcm.vidm.core.task.StartVidmGenericTask","finishState":"","errorState":"","properties":{},"uiProperties":{"displayText":"Re-trust VMware Identity Manager on vRealize Automation","displayKey":"vmf::sm::vravaretrustvidm"},"nodes":[{"symbolicName":"com.vmware.vrealize.lcm.vidm.core.task.StartVidmGenericTask","symbolicNameTxt":null,"type":"SIMPLE","task":"com.vmware.vrealize.lcm.vidm.core.task.StartVidmGenericTask","properties":{},"uiProperties":{"displayText":"Start VMware Identity Manager flow","displayKey":"vmf::sm::vravaretrustvidm::node::c.v.v.l.v.c.t.StartVidmGenericTask"}}, * * *01 completion","displayKey":"vmf::sm::vravaretrustvidm::edge::name::OnVravaCertificateInventoryCompletion::source::c.v.v.l.p.c.v.t.VraVaUpdateCertificateInInventoryTask"}},{"name":"OnVidmUpdateAllowRedirectSuccess","source":"com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask","destination":"com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask","type":"SM_EVENT","properties":{},"uiProperties":{"displayText":"Updating VMware Identity Manager Allow redirect 
success","displayKey":"vmf::sm::vravaretrustvidm::edge::name::OnVidmUpdateAllowRedirectSuccess::source::c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask"}}]}

10. Starts processing the engine request:

2022-07-28 00:27:38.908 INFO [scheduling-1] c.v.v.l.a.c.FlowProcessor - -- => bcbf42fe-3a24-4109-8e9e-a573b9e67d5b
2022-07-28 00:27:38.912 INFO [scheduling-1] c.v.v.l.a.c.FlowProcessor - -- Processing the Engine Request to create the machine with ID => vravaretrustvidm and the priority is => 0
2022-07-28 00:27:38.912 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vravaretrustvidm
2022-07-28 00:27:38.912 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vravaretrustvidm
2022-07-28 00:27:38.913 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory
2022-07-28 00:27:38.942 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vravaretrustvidm.vmfx
2022-07-28 00:27:38.943 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl
2022-07-28 00:27:38.943 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo
2022-07-28 00:27:38.943 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vravaretrustvidm.vmfx
2022-07-28 00:27:38.944 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx
2022-07-28 00:27:38.944 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx
2022-07-28
00:27:38.947 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vravaretrustvidm', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vravaretrustvidm.vmfx'} 2022-07-28 00:27:38.955 INFO [scheduling-1] c.v.v.l.a.c.FlowProcessor - -- Injected OnStart Edge for the Machine ID :: vravaretrustvidm 2022-07-28 00:27:38.998 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "846ee8ea-e561-4906-8e5d-f5960a431dbc", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658968058953, "lastUpdatedOn" : 1658968058979, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnStart", "currentState" : null, "eventArgument" : "{\"productSpec\":{\"name\":\"productSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ProductSpecification\",\"value\":\"{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"displayName\\\":null,\\\"productVersion\\\":null,\\\"priority\\\":0,\\\"dependsOn\\\":[],\\\"components \\\":[{\\\"component\\\":{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"cafeHostNamePrimary\\\":\\\"vra.cap.org\\\",\\\"cafeRootPasswordPrimary\\\":\\\"JXJXJXJX\\\",\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"b aseTenantId\\\":\\\"idm\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"version\\\":\\\"8.8.1\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"__version\\\":\\\"8.8.1\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\ \\",\\\"masterVidmHostName\\\":\\\"idm.cap.org\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"isLBSslTerminated\\\":\\\"false\\\",\\\"authProviderHostnames\\\":\\\"idm.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"idm.cap.org\\\"}},\\\"priority\\\":KXKXKXKX\"}}", "status" : "CREATED", "stateMachineInstance" : 
"ba1d3e0f-6c87-4e29-bca6-c347dd9bda6b", "errorCause" : null, "sequence" : 563559, "eventLock" : 1, "engineNodeId" : "lcm.cap.org" } 11. On vIDM Generic is initialized 2022-07-28 00:27:39.074 INFO [pool-3-thread-24] c.v.v.l.p.a.s.Task - -- KEY PICKER IS :: com.vmware.vrealize.lcm.drivers.commonplugin.task.keypicker.GenericProductSpecKeyPicker 2022-07-28 00:27:39.559 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "e596cea2-9b2d-4454-a5c5-811b52d35d86", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658968059076, "lastUpdatedOn" : 1658968059540, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnVidmGenericInitialized", "currentState" : null, "eventArgument" : "{\"componentSpec\":{\"name\":\"componentSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ComponentDeploymentSpecification\",\"value\":\"{\\\"component\\\":{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"cafeHostNamePrimary\\\":\\\"vra.cap.org\\\",\\\"cafeRootPasswordPrimary\\\":\\\"JXJXJXJX\\\",\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"baseTenantId\\\":\\\"idm\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"version\\\":\\\"8.8.1\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"__version\\\":\\\"8.8.1\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"masterVidmHostName\\\":\\\"idm.cap.org\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"isLBSslTerminated\\\":\\\"false\\\",\\\"authProviderHostnames\\\":\\\"idm.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"idm.cap.org\\\"}},\\\"priority\\\":KXKXKXKX\"},\"productSpec\":{\"name\":\"productSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ProductSpecification\",\"value\":\"{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"displayName\\\":null,\\\"productVersion\\\":null,\\\"priority\\\"
:0,\\\"dependsOn\\\":[],\\\"components\\\":[{\\\"component\\\":{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"cafeHostNamePrimary\\\":\\\"vra.cap.org\\\",\\\"cafeRootPasswordPrimary\\\":\\\"JXJXJXJX\\\",\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"baseTenantId\\\":\\\"idm\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"version\\\":\\\"8.8.1\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"__version\\\":\\\"8.8.1\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"masterVidmHostName\\\":\\\"idm.cap.org\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"isLBSslTerminated\\\":\\\"false\\\",\\\"authProviderHostnames\\\":\\\"idm.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"idm.cap.org\\\"}},\\\"priority\\\":KXKXKXKX\"}}", "status" : "CREATED", "stateMachineInstance" : "ba1d3e0f-6c87-4e29-bca6-c347dd9bda6b", "errorCause" : null, "sequence" : 563561, "eventLock" : 1, "engineNodeId" : "lcm.cap.org" } 12. vRA VA check vIDM root certificate task is initiated. On this task "vracli -j vidm " command is executed. 
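When troubleshooting this step by hand, the JSON envelope that `vracli -j vidm` prints can be inspected directly. Below is a minimal Python sketch, not part of vRSLCM itself; the field names (`status_code`, `output`, `cert`, `url`, `user`) are assumptions taken from the log excerpt shown in this post:

```python
import base64
import hashlib
import json


def parse_vracli_vidm(raw_stdout: str) -> dict:
    """Parse the JSON envelope printed by `vracli -j vidm`.

    Raises if the command reported a non-zero status_code, otherwise
    returns the inner "output" object (cert, url, user, clients, ...).
    """
    envelope = json.loads(raw_stdout)
    if envelope.get("status_code") != 0:
        raise RuntimeError(f"vracli failed: {envelope.get('error')}")
    return envelope["output"]


def pem_thumbprint(pem: str, algo: str = "sha256") -> str:
    """Hex digest of the DER bytes inside a PEM certificate block."""
    body = "".join(
        line.strip()
        for line in pem.splitlines()
        if line.strip() and "-----" not in line
    )
    der = base64.b64decode(body)
    return hashlib.new(algo, der).hexdigest().upper()
```

In the actual flow, vRSLCM executes the command over SSH on the vRA appliance; here you would paste the captured stdout into `parse_vracli_vidm` and compare `pem_thumbprint` of the returned cert against what the task logs.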
A response is successfully received 2022-07-28 00:27:39.613 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaCheckVidmRootCertificateTask 2022-07-28 00:27:39.618 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaCheckVidmRootCertificateTask 2022-07-28 00:27:39.657 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:27:39.663 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.668 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.672 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.675 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:27:39.677 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.679 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.682 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:39.684 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:27:39.684 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
2022-07-28 00:27:39.688 INFO [pool-3-thread-39] c.v.v.l.p.c.v.t.VraVaCheckVidmRootCertificateTask - -- Starting VraVaCheckVidmRootCertificate Task vravaretrustvidm 2022-07-28 00:27:39.699 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command to be run : vracli -j vidm 2022-07-28 00:27:39.701 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:27:39.702 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: vracli -j vidm 2022-07-28 00:27:39.815 INFO [pool-3-thread-39] c.v.v.l.u.SshUtils - -- Executing command --> vracli -j vidm 2022-07-28 00:27:41.377 INFO [pool-3-thread-39] c.v.v.l.u.SshUtils - -- exit-status: 0 2022-07-28 00:27:41.382 INFO [pool-3-thread-39] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:27:41.383 INFO [pool-3-thread-39] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "{\"status_code\": 0, \"output\": {\"cert\": \"-----BEGIN CERTIFICATE-----\\nMIIDzjCCAragAwIBAgIGAX2PFLkmMA0GCSqGSIb3DQEBCwUAMFMxMzAxBgNVBAMM\\nKnZSZWFsaXplIFN1aXRlIExpZmVjeWNsZSBNYW5hZ2VyIExvY2tlciBDQTEPMA0G\\nA1UECgwGVk13YXJlMQswCQYDVQQGEwJJTjAeFw0yMTEyMDYwOTMwMzlaFw0yMzEy\\nMDYwOTMwMzlaMFkxFDASBgNVBAMMC2lkbS5jYXAub3JnMQwwCg * * * FAl4dGC3jhaO9+icDZiKBb5hsR/wntUzF9Nqns9JoABFdfq+FDqmhw+U\\nak9o9GsHpGVghl7Vs2ExneVikFm9bbon2QucASTLXvc6wXD7kkRSqTG/DvoDVvuI\\nSyplnOpNDLcnWyNB+V7djqYE6ybARErFLKk4LGiJxvunVTl3U5L8HsxqQy+Au9gF\\ngAnnMWcisaDxDMeuR5/Ome1RybvvZ27YeAPe5t+y6aICi8m1g/bF3+naHtKPFa50\\n/G9JkfAgPJSsczhG6XoDBz0cTz44EK2QLM6fHptn0m1oCi5pNuvg40KWWHgmJhHO\\ntN/HBMZylOQGHWDwiVWkUOw0\\n-----END CERTIFICATE-----\\n\", \"clients\": {\"ClientID\": \"prelude-UyLAxiyMkl\", \"ClientIDUser\": \"prelude-user-elkli9TRYF\", \"ClientSecret\": \"JXJXJXJX\", \"ClientSecretUser\": \"JXJXJXJX\"}, \"defaultOrgAlias\": \"\", \"defaultOrgName\": \"IDM\", \"isDefaultOrgAliasUpdated\": true, \"sha256\": 
\"b41714bbd62d342281986e0c80533d179de47579ccef7b0037da4c98f23010de\", \"url\": \"https://idm.cap.org\", \"user\": \"configadmin\", \"verify_cert\": false}, \"error\": \"\", \"logs\": \"\"}\n", "errorData" : null, "commandTimedOut" : false } 2022-07-28 00:27:41.386 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 0 2022-07-28 00:27:41.390 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:27:41.390 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:27:41.391 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:27:41.392 INFO [pool-3-thread-39] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- {"status_code": 0, "output": {"cert": "-----BEGIN CERTIFICATE-----\nMIIDzjCCAragAwIBAgIGAX2PFLkmMA0GCSqGSIb3DQEBCwUAMFMxMzAxBgNVBAMM\nKnZSZWFsaXplIFN1aXRlIExpZmVjeWNsZSBNYW5hZ2VyIExvY2tlciBDQTEPMA0G\nA1UECgwGVk13YXJlMQswCQYDVQQGEwJJTjAeFw0yMTEyMDYwOTMwMzlaFw0yMzEy\nMDYwOTMwMzlaMFkxFDASBgNVBAMMC2lkbS5jYXAub3JnMQwwCgYDVQ* * DZiKBb5hsR/wntUzF9Nqns9JoABFdfq+FDqmhw+U\nak9o9GsHpGVghl7Vs2ExneVikFm9bbon2QucASTLXvc6wXD7kkRSqTG/DvoDVvuI\nSyplnOpNDLcnWyNB+V7djqYE6ybARErFLKk4LGiJxvunVTl3U5L8HsxqQy+Au9gF\ngAnnMWcisaDxDMeuR5/Ome1RybvvZ27YeAPe5t+y6aICi8m1g/bF3+naHtKPFa50\n/G9JkfAgPJSsczhG6XoDBz0cTz44EK2QLM6fHptn0m1oCi5pNuvg40KWWHgmJhHO\ntN/HBMZylOQGHWDwiVWkUOw0\n-----END CERTIFICATE-----\n", "clients": {"ClientID": "prelude-UyLAxiyMkl", "ClientIDUser": "prelude-user-elkli9TRYF", "ClientSecret": "JXJXJXJX", "ClientSecretUser": "JXJXJXJX"}, "defaultOrgAlias": "", "defaultOrgName": "IDM", "isDefaultOrgAliasUpdated": true, "sha256": "b41714bbd62d342281986e0c80533d179de47579ccef7b0037da4c98f23010de", "url": "https://idm.cap.org", "user": "configadmin", "verify_cert": false}, "error": "", "logs": ""} * * * 2022-07-28 00:27:41.430 INFO [pool-3-thread-39] 
c.v.v.l.p.c.v.t.VraVaCheckVidmRootCertificateTask - -- vIDM details retrieved from vRA : Output [clients=Clients [ClientIDUser=null, ClientID=null, ClientSecretUser=KXKXKXKX, sha265=null, cert=-----BEGIN CERTIFICATE----- MIIDzjCCAragAwIBAgIGAX2PFLkmMA0GCSqGSIb3DQEBCwUAMFMxMzAxBgNVBAMM KnZSZWFsaXplIFN1aXRlIExpZmVjeWNsZSBNYW5hZ2VyIExvY2tlciBDQTEPMA0G * * * /G9JkfAgPJSsczhG6XoDBz0cTz44EK2QLM6fHptn0m1oCi5pNuvg40KWWHgmJhHO tN/HBMZylOQGHWDwiVWkUOw0 -----END CERTIFICATE----- , defaultOrgName=IDM, user=configadmin, url=https://idm.cap.org] 2022-07-28 00:27:41.453 INFO [pool-3-thread-39] c.v.v.l.u.CertificateUtil - -- requestUrl : https://idm.cap.org 2022-07-28 00:27:41.498 INFO [pool-3-thread-39] c.v.v.l.p.c.v.t.VraVaCheckVidmRootCertificateTask - -- vIDM CA certificates thumbprints: [D5236B54***9746E24052] 2022-07-28 00:27:41.501 INFO [pool-3-thread-39] c.v.v.l.p.c.v.t.VraVaCheckVidmRootCertificateTask - -- Thumbprints of vIDM certificates retrieved from vRA: [D5236B54***9746E24052] 13. Starts vIDM Instance availability task 2022-07-28 00:27:42.036 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vravaretrustvidm 2022-07-28 00:27:42.037 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vravaretrustvidm 2022-07-28 00:27:42.037 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2022-07-28 00:27:42.065 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:42.066 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2022-07-28 00:27:42.066 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2022-07-28 00:27:42.066 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:42.067 INFO [scheduling-1] 
c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:42.067 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:27:42.068 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vravaretrustvidm', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vravaretrustvidm.vmfx'} 2022-07-28 00:27:42.069 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVraVaVidmRootCertificateNotPresent 2022-07-28 00:27:42.069 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaCheckVidmRootCertificateTask 2022-07-28 00:27:42.070 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmInstanceAvailabilityCheckTask 2022-07-28 00:27:42.075 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.vidm.core.task.VidmInstanceAvailabilityCheckTask 2022-07-28 00:27:42.099 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:27:42.100 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.104 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.108 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.111 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.113 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:27:42.114 INFO [scheduling-1] 
c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.115 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.117 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.118 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.120 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:27:42.121 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 2022-07-28 00:27:42.128 INFO [pool-3-thread-36] c.v.v.l.v.c.t.VidmInstanceAvailabilityCheckTask - -- Starting vIDM instance availability check task 2022-07-28 00:27:42.409 INFO [pool-3-thread-36] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmInstanceAvailabilityCheckSuccess 14. 
VidmLoginTokenCheckTask also succeeds, as the login token was obtained 2022-07-28 00:27:42.642 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmInstanceAvailabilityCheckTask 2022-07-28 00:27:42.642 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmLoginTokenCheckTask 2022-07-28 00:27:42.646 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.vidm.core.task.VidmLoginTokenCheckTask 2022-07-28 00:27:42.668 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: configurationPropertyService 2022-07-28 00:27:42.670 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:27:42.672 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.676 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.679 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:27:42.683 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:42.898 INFO [pool-3-thread-45] c.v.v.l.v.c.t.VidmLoginTokenCheckTask - -- vIDM login token obtained. Proceeding to next task 2022-07-28 00:29:42.960 INFO [pool-3-thread-45] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmLoginTokenCheckSuccess 15. 
vIDM healthcheck for vRA task is initiated 2022-07-28 00:29:43.232 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVidmLoginTokenCheckSuccess 2022-07-28 00:29:43.233 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmLoginTokenCheckTask 2022-07-28 00:29:43.233 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmHealthCheck4vRATask 2022-07-28 00:29:43.238 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.vidm.core.task.VidmHealthCheck4vRATask 2022-07-28 00:29:43.275 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: configurationPropertyService 2022-07-28 00:29:43.276 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:29:43.277 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.283 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.287 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.290 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.293 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:29:43.294 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
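The health check that starts next is bounded by two Config Service properties visible in the log that follows: healthChkSleepMaxRetries (60) and healthChkSleepTimeMillis (10000), i.e. up to 60 attempts spaced 10 seconds apart. A generic sketch of that polling pattern (the function name is illustrative, not from the product):

```python
import time

def poll_until(check, max_retries=60, sleep_millis=10000, _sleep=time.sleep):
    """Retry pattern matching the healthChkSleepMaxRetries /
    healthChkSleepTimeMillis behaviour: call `check` up to
    `max_retries` times, sleeping between attempts, and return
    True as soon as it succeeds."""
    for _attempt in range(max_retries):
        if check():
            return True
        _sleep(sleep_millis / 1000.0)
    return False

# Example: succeed on the third attempt, without real waiting.
attempts = iter([False, False, True])
print(poll_until(lambda: next(attempts), max_retries=5, _sleep=lambda s: None))  # True
```

With the lab's defaults the task therefore gives vIDM up to roughly ten minutes to come back before failing.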
2022-07-28 00:29:43.302 INFO [pool-3-thread-30] c.v.v.l.v.c.t.VidmHealthCheck4vRATask - -- Starting vIDM health check task 2022-07-28 00:29:43.306 INFO [pool-3-thread-30] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- vIDM Configuration property healthChkSleepMaxRetries value is obtained from Config Service : 60 2022-07-28 00:29:43.308 INFO [pool-3-thread-30] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- vIDM Configuration property healthChkSleepTimeMillis value is obtained from Config Service : 10000 2022-07-28 00:29:43.436 INFO [pool-3-thread-30] c.v.v.l.v.d.r.c.VidmRestClient - -- API Response Status : 200 Response Message : {"clusterInstances":[{"version":"3.3.6.0 Build 19203469","uuid":"34f02008-8429-329b-808f-75724dcc3695","status":"Active","lastUpdated":1658968171385,"hostname":"idm.cap.org","datacenterId":1,"id":2,"ipaddress":"*.**.****.*"}],"_links":{}} 2022-07-28 00:29:43.438 INFO [pool-3-thread-30] c.v.v.l.v.d.r.u.VidmServerRestUtil - -- String response vIDM get all cluster instance : VidmRestClientResponseDTO [statusCode=200, responseMessage={"clusterInstances":[{"version":"3.3.6.0 Build 19203469","uuid":"34f02008-8429-329b-808f-75724dcc3695","status":"Active","lastUpdated":1658968171385,"hostname":"idm.cap.org","datacenterId":1,"id":2,"ipaddress":"*.**.****.*"}],"_links":{}}] 2022-07-28 00:29:43.447 INFO [pool-3-thread-30] c.v.v.l.v.c.t.VidmHealthCheck4vRATask - -- Successfully verified status of the vIDM nodes. 2022-07-28 00:29:43.447 INFO [pool-3-thread-30] c.v.v.l.v.c.t.VidmHealthCheck4vRATask - -- vIDM Health is OK. Proceeding to next task. 
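As the response above shows, VidmHealthCheck4vRATask fetches all vIDM cluster instances and treats vIDM as healthy only when every node reports an Active status. A minimal sketch of that decision, assuming the response shape shown in the log (the function name is illustrative):

```python
import json

def vidm_cluster_healthy(response_body):
    """Return True when every vIDM cluster instance reports 'Active',
    mirroring the check applied to the cluster-instances response
    in the log above. An empty instance list counts as unhealthy."""
    instances = json.loads(response_body).get("clusterInstances", [])
    return bool(instances) and all(inst.get("status") == "Active" for inst in instances)

# Response shape taken from the log excerpt (trimmed):
sample = '{"clusterInstances":[{"version":"3.3.6.0 Build 19203469","status":"Active","hostname":"idm.cap.org"}],"_links":{}}'
print(vidm_cluster_healthy(sample))  # True
```

In this lab there is a single vIDM node, so one Active entry is enough for "vIDM Health is OK".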
2022-07-28 00:29:43.447 INFO [pool-3-thread-30] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmHealthCheckTaskCompletion 2022-07-28 00:29:43.448 INFO [pool-3-thread-30] c.v.v.l.p.a.s.Task - -- ======================================== { "componentSpec" : { "object" : { "component" : { "symbolicName" : "vravaretrustvidm", "type" : null, "componentVersion" : null, "properties" : { "cafeHostNamePrimary" : "vra.cap.org", "cafeRootPasswordPrimary" : "JXJXJXJX", "vidmPrimaryNodeRootPassword" : "JXJXJXJX", "baseTenantId" : "idm", "uberAdminUserType" : "LOCAL", "version" : "8.8.1", "masterVidmAdminPassword" : "JXJXJXJX", "uberAdmin" : "configadmin", "masterVidmEnabled" : "true", "__version" : "8.8.1", "uberAdminPassword" : "JXJXJXJX", "masterVidmHostName" : "idm.cap.org", "masterVidmAdminUserName" : "admin", "isLBSslTerminated" : "false", "authProviderHostnames" : "idm.cap.org", "vidmPrimaryNodeHostname" : "idm.cap.org" } }, "priority" : 0 * * * 2022-07-28 00:29:43.450 INFO [pool-3-thread-30] c.v.v.l.p.a.s.Task - -- FIELD NAME :: componentSpec 2022-07-28 00:29:43.450 INFO [pool-3-thread-30] c.v.v.l.p.a.s.Task - -- KEY PICKER IS :: com.vmware.vrealize.lcm.drivers.commonplugin.task.keypicker.GenericComponentSpecKeyPicker 2022-07-28 00:29:43.759 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "e100878f-639e-4689-8584-79e9ebe9170b", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658968183452, "lastUpdatedOn" : 1658968183740, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnVidmHealthCheckTaskCompletion", "currentState" : null, "eventArgument" : 
"{\"componentSpec\":{\"name\":\"componentSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ComponentDeploymentSpecification\",\"value\":\"{\\\"component\\\":{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"cafeHostNamePrimary\\\":\\\"vra.cap.org\\\",\\\"cafeRootPasswordPrimary\\\":\\\"JXJXJXJX\\\",\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"baseTenantId\\\":\\\"idm\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"version\\\":\\\"8.8.1\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"__version\\\":\\\"8.8.1\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"masterVidmHostName\\\":\\\"idm.cap.org\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"isLBSslTerminated\\\":\\\"false\\\",\\\"authProviderHostnames\\\":\\\"idm.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"idm.cap.org\\\"}},\\\"priority\\\":KXKXKXKX\"}}", "status" : "CREATED", "stateMachineInstance" : "ba1d3e0f-6c87-4e29-bca6-c347dd9bda6b", "errorCause" : null, "sequence" : 563577, "eventLock" : 1, "engineNodeId" : "lcm.cap.org" } 16. 
vIDM Super User availability check is triggered and completed 2022-07-28 00:29:43.766 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: vravaretrustvidm 2022-07-28 00:29:43.766 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->vravaretrustvidm 2022-07-28 00:29:43.767 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2022-07-28 00:29:43.797 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:29:43.797 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2022-07-28 00:29:43.798 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2022-07-28 00:29:43.798 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/vravaretrustvidm.vmfx 2022-07-28 00:29:43.798 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:29:43.798 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/vravaretrustvidm.vmfx 2022-07-28 00:29:43.800 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='vravaretrustvidm', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/vravaretrustvidm.vmfx'} 2022-07-28 00:29:43.802 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVidmHealthCheckTaskCompletion 2022-07-28 00:29:43.803 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmHealthCheck4vRATask 2022-07-28 00:29:43.803 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmSuperUserAvailabilityCheckTask 2022-07-28 00:29:43.809 
INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.vidm.core.task.VidmSuperUserAvailabilityCheckTask 2022-07-28 00:29:43.813 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: configurationPropertyService 2022-07-28 00:29:43.814 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:29:43.815 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.817 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.820 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.821 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:43.823 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:29:43.823 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
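The availability check that produces the log lines below searches vIDM's SCIM Users API (the `location` values in the response point at /SAAS/jersey/manager/api/scim/Users) and then scans the results for the expected configuration admin. A rough sketch of that matching step, assuming the SCIM response shape shown in the logs (the function name is illustrative):

```python
import json

def find_user(scim_response, username):
    """Return the SCIM resource whose userName matches, or None.
    VidmSuperUserAvailabilityCheckTask does the equivalent: it logs
    the number of users returned, then searches the Resources array
    for the expected user (here 'configadmin')."""
    body = json.loads(scim_response)
    for resource in body.get("Resources", []):
        if resource.get("userName") == username:
            return resource
    return None

# Trimmed-down version of the SCIM response in the log:
sample = json.dumps({
    "totalResults": 1,
    "Resources": [{"active": True, "userName": "configadmin",
                   "roles": [{"display": "Administrator"}]}],
})
print(find_user(sample, "configadmin") is not None)  # True
```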
2022-07-28 00:29:43.827 INFO [pool-3-thread-3] c.v.v.l.v.c.t.VidmSuperUserAvailabilityCheckTask - -- Starting vIDM super user availability task 2022-07-28 00:29:43.829 INFO [pool-3-thread-3] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- vIDM Configuration property userChkSleepMaxRetries value is obtained from Config Service : 120 2022-07-28 00:29:43.830 INFO [pool-3-thread-3] c.v.v.l.v.c.t.u.VidmInstallTaskUtil - -- vIDM Configuration property userChkSleepTimeMillis value is obtained from Config Service : 10000 2022-07-28 00:29:43.911 INFO [pool-3-thread-3] c.v.v.l.v.d.r.u.VidmUserGroupMgmtRestUtil - -- VidmUserGroupMgmtRestUtil::getUserByUsernameAndUserType - Request to fetch user in a group 2022-07-28 00:29:43.950 INFO [pool-3-thread-3] c.v.v.l.v.d.r.c.VidmRestClient - -- API Response Status : 200 Response Message : {"totalResults":1,"itemsPerPage":1,"startIndex":1,"schemas":["urn:scim:schemas:core:1.0","urn:scim:schemas:extension:workspace:1.0","urn:scim:schemas:extension:enterprise:1.0","urn:scim:schemas:extension:workspace:mfa:1.0"],"Resources":[{"active":true,"userName":"configadmin","id":"a58bddc6-314b-4220-9ba4-cea4d9c5dbff","meta":{"created":"2021-12-06T10:05:21.025Z","lastModified":"2021-12-06T10:05:21.992Z","location":"https://idm.cap.org/SAAS/jersey/manager/api/scim/Users/a58bddc6-314b-4220-9ba4-cea4d9c5dbff","version":"W/\"1638785121992\""},"name":{"givenName":"configadmin","familyName":"configadmin"},"emails":[{"value":"configadmin@vsphere.local"}],"groups":[{"value":"237386ee-7f61-4d3a-93fa-1569d4bf673a","type":"direct","display":"ALL USERS"}],"roles":[{"value":"84a56b68-f8d5-4b9e-a365-92ef2adb3fb3","display":"User"},{"value":"55048dee-fe1b-404a-936d-3e0b86a7209e","display":"Administrator"}],"urn:scim:schemas:extension:workspace:1.0":{"internalUserType":"LOCAL","userStatus":"1","domain":"System Domain","userStoreUuid":"2e0fefa1-077e-423a-8095-c5f46b0d1827"}}]} 2022-07-28 00:29:43.952 INFO [pool-3-thread-3] c.v.v.l.v.d.r.u.VidmUserGroupMgmtRestUtil - -- 
VidmUserGroupMgmtRestUtil::getUser - Successfully fetched user 2022-07-28 00:29:43.953 INFO [pool-3-thread-3] c.v.v.l.v.d.r.u.VidmUserGroupMgmtRestUtil - -- Response : VidmRestClientResponseDTO [statusCode=200, responseMessage={"totalResults":1,"itemsPerPage":1,"startIndex":1,"schemas":["urn:scim:schemas:core:1.0","urn:scim:schemas:extension:workspace:1.0","urn:scim:schemas:extension:enterprise:1.0","urn:scim:schemas:extension:workspace:mfa:1.0"],"Resources":[{"active":true,"userName":"configadmin","id":"a58bddc6-314b-4220-9ba4-cea4d9c5dbff","meta":{"created":"2021-12-06T10:05:21.025Z","lastModified":"2021-12-06T10:05:21.992Z","location":"https://idm.cap.org/SAAS/jersey/manager/api/scim/Users/a58bddc6-314b-4220-9ba4-cea4d9c5dbff","version":"W/\"1638785121992\""},"name":{"givenName":"configadmin","familyName":"configadmin"},"emails":[{"value":"configadmin@vsphere.local"}],"groups":[{"value":"237386ee-7f61-4d3a-93fa-1569d4bf673a","type":"direct","display":"ALL USERS"}],"roles":[{"value":"84a56b68-f8d5-4b9e-a365-92ef2adb3fb3","display":"User"},{"value":"55048dee-fe1b-404a-936d-3e0b86a7209e","display":"Administrator"}],"urn:scim:schemas:extension:workspace:1.0":{"internalUserType":"LOCAL","userStatus":"1","domain":"System Domain","userStoreUuid":"2e0fefa1-077e-423a-8095-c5f46b0d1827"}}]}] 2022-07-28 00:29:43.960 INFO [pool-3-thread-3] c.v.v.l.v.c.t.VidmSuperUserAvailabilityCheckTask - -- vIDM search user. Number of users returned : 1 2022-07-28 00:29:43.960 INFO [pool-3-thread-3] c.v.v.l.v.c.t.VidmSuperUserAvailabilityCheckTask - -- Searching for configadmin in the response ... 2022-07-28 00:29:43.960 INFO [pool-3-thread-3] c.v.v.l.v.c.t.VidmSuperUserAvailabilityCheckTask - -- User : configadmin 2022-07-28 00:29:43.960 INFO [pool-3-thread-3] c.v.v.l.v.c.t.VidmSuperUserAvailabilityCheckTask - -- User configadmin found to be available in vIDM 2022-07-28 00:29:43.961 INFO [pool-3-thread-3] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmSuperUserAvailabilitySuccess 17. 
vRA VA set vIDM task starts 2022-07-28 00:29:44.405 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVidmSuperUserAvailabilitySuccess 2022-07-28 00:29:44.406 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmSuperUserAvailabilityCheckTask 2022-07-28 00:29:44.406 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaSetVidmTask 2022-07-28 00:29:44.412 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaSetVidmTask 2022-07-28 00:29:44.490 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:29:44.492 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:44.496 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:44.500 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:44.503 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:29:44.504 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
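The task that starts below drives everything over SSH on the vRA appliance: it removes any stale temp files, stages the vIDM admin password (and, implied by the later -r flag, the vIDM root certificate) under /tmp, runs vracli vidm set, and finally deletes both files again. A sketch of that order of operations as a command list; this is an illustration reconstructed from the log, not the product code, and the argument redacted in the log is omitted here:

```python
# Order of operations VraVaSetVidmTask performs on vra.cap.org,
# reconstructed from the log excerpt below. Placeholder values in
# angle brackets stand in for secrets; one redacted vracli argument
# is left out. The log also scrubs shell history after the echo.
PASSWORD_FILE = "/tmp/adminpassword.txt"
CERT_FILE = "/tmp/vidmrootcert.pem"

def build_set_vidm_commands(vidm_url, vidm_admin):
    return [
        f"rm {PASSWORD_FILE}",   # exit status 1 if absent -- harmless
        f"rm {CERT_FILE}",
        f"echo '<vidm-admin-password>' > {PASSWORD_FILE}",
        f"echo '<vidm-root-ca-pem>' > {CERT_FILE}",  # cert staging is elided in the log
        f"vracli vidm set {vidm_url} {vidm_admin} {PASSWORD_FILE} -r {CERT_FILE}",
        f"rm {PASSWORD_FILE}",   # clean up the secrets afterwards
        f"rm {CERT_FILE}",
    ]

for cmd in build_set_vidm_commands("https://idm.cap.org", "admin"):
    print(cmd)
```

Note how the initial rm commands return exit status 1 ("cannot remove ... such file or directory") and the history scrub returns exit status 2; neither fails the task, since only the vracli call's status matters.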
2022-07-28 00:29:44.513 INFO [pool-3-thread-10] c.v.v.l.p.c.v.t.VraVaSetVidmTask - -- Starting :: Set vRA VA VIDM Task 2022-07-28 00:29:44.515 INFO [pool-3-thread-10] c.v.v.l.p.c.v.t.VraVaSetVidmTask - -- Trying to set vIDM with root certificate 2022-07-28 00:29:44.515 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:44.516 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: rm /tmp/adminpassword.txt 2022-07-28 00:29:44.596 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> rm /tmp/adminpassword.txt 2022-07-28 00:29:45.145 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 1 2022-07-28 00:29:45.145 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:45.145 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 1, "outputData" : "", "errorData" : "rm: cannot remove '/tmp/adminpassword.txt': YXYXYXYX such file or directory\n", "commandTimedOut" : false } * * * 2022-07-28 00:29:45.147 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:45.147 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: rm /tmp/vidmrootcert.pem 2022-07-28 00:29:45.216 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> rm /tmp/vidmrootcert.pem 2022-07-28 00:29:45.764 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 1 2022-07-28 00:29:45.764 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:45.765 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 1, "outputData" : "", "errorData" : "rm: cannot remove '/tmp/vidmrootcert.pem': KXKXKXKX such file or directory\n", "commandTimedOut" : false } 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command 
Status code :: 1 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:45.766 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- rm: cannot remove '/tmp/vidmrootcert.pem': KXKXKXKX such file or directory 2022-07-28 00:29:45.767 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:45.767 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:45.767 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: echo 'MXMXMXMX' > /tmp/adminpassword.txt;history YXYXYXYX 1) 2022-07-28 00:29:45.874 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> echo 'MXMXMXMX' > /tmp/adminpassword.txt;history YXYXYXYX 1) 2022-07-28 00:29:46.424 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 2 2022-07-28 00:29:46.425 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:46.426 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 2, "outputData" : "", "errorData" : 
"bash: line 0: history: -d: option requires an argument\nhistory: usage: history [-c] [-d offset] [n] or history -anrw [filename] or history -ps arg [arg...]\n", "commandTimedOut" : false } 2022-07-28 00:29:46.426 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 2 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:46.427 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- bash: line 0: history: -d: option requires an argument history: usage: history [-c] [-d offset] [n] or history -anrw [filename] or history -ps arg [arg...] 
2022-07-28 00:29:46.428 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:46.428 INFO [pool-3-thread-10] c.v.v.l.u.CertificateUtil - -- requestUrl : https://idm.cap.org 2022-07-28 00:29:46.494 INFO [pool-3-thread-10] c.v.v.l.c.l.MaskingPrintStream - -- * SYSOUT/SYSERR CAPTURED: -- [ [0] Version: 3 SerialNumber: 1638415830509 IssuerDN: CN=vRealize Suite Lifecycle Manager Locker CA,O=VMware,C=IN Start Date: Thu Dec 02 03:30:30 UTC 2021 Final Date: Sun Nov 30 03:30:30 UTC 2031 SubjectDN: CN=vRealize Suite Lifecycle Manager Locker CA,O=VMware,C=IN * * * 2022-07-28 00:29:47.157 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 0 2022-07-28 00:29:47.157 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:47.158 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 0 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 
==================================================== 2022-07-28 00:29:47.159 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- null 2022-07-28 00:29:47.160 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:47.160 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command to be run : vracli vidm set https://idm.cap.org admin /tmp/adminpassword.txt YXYXYXYX -r /tmp/vidmrootcert.pem 2022-07-28 00:29:47.160 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:47.161 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: vracli vidm set https://idm.cap.org admin /tmp/adminpassword.txt YXYXYXYX -r /tmp/vidmrootcert.pem 2022-07-28 00:29:47.261 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> vracli vidm set https://idm.cap.org admin /tmp/adminpassword.txt YXYXYXYX -r /tmp/vidmrootcert.pem 2022-07-28 00:29:50.813 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 0 2022-07-28 00:29:50.889 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:50.890 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "2022-07-28 00:29:48,209 [INFO] Setting vIDM certificate from /tmp/vidmrootcert.pem\n2022-07-28 00:29:48,354 [INFO] Getting information about client: prelude-UyLAxiyMkl\n2022-07-28 00:29:48,476 [INFO] Updating vIDM client prelude-UyLAxiyMkl with grant_types: client_credentials and redirect URLs: None\n2022-07-28 00:29:48,676 [INFO] Getting information about client: prelude-user-elkli9TRYF\n2022-07-28 00:29:48,794 [INFO] Updating vIDM client prelude-user-elkli9TRYF with grant_types: refresh_token authorization_code password YXYXYXYX and redirect URLs: https://vra.cap.org/identity/api/core/authn/csp, 
https://idm.vra.cap.org/identity/api/core/authn/csp\n2022-07-28 00:29:49,027 [INFO] Getting information about tenant: IDM\n2022-07-28 00:29:50,509 [INFO] Restarting Identity service pod(s)\n", "errorData" : null, "commandTimedOut" : false } 2022-07-28 00:29:50.892 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 0 2022-07-28 00:29:50.895 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:50.895 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:50.895 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:48,209 [INFO] Setting vIDM certificate from /tmp/vidmrootcert.pem 2022-07-28 00:29:48,354 [INFO] Getting information about client: prelude-UyLAxiyMkl 2022-07-28 00:29:48,476 [INFO] Updating vIDM client prelude-UyLAxiyMkl with grant_types: client_credentials and redirect URLs: None 2022-07-28 00:29:48,676 [INFO] Getting information about client: prelude-user-elkli9TRYF 2022-07-28 00:29:48,794 [INFO] Updating vIDM client prelude-user-elkli9TRYF with grant_types: refresh_token authorization_code password YXYXYXYX and redirect URLs: https://vra.cap.org/identity/api/core/authn/csp, https://idm.vra.cap.org/identity/api/core/authn/csp 2022-07-28 00:29:49,027 [INFO] Getting information about tenant: IDM 2022-07-28 00:29:50,509 [INFO] Restarting Identity service pod(s) 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 
==================================================== 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- null 2022-07-28 00:29:50.896 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:50.897 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Set VIDM Host successful on :: vra.cap.org 2022-07-28 00:29:50.898 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:50.898 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: rm /tmp/adminpassword.txt 2022-07-28 00:29:51.298 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> rm /tmp/adminpassword.txt 2022-07-28 00:29:51.844 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 0 2022-07-28 00:29:51.844 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:51.845 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 0 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:51.846 INFO [pool-3-thread-10] 
c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:51.847 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:51.847 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- null 2022-07-28 00:29:51.847 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:51.847 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:51.847 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: rm /tmp/vidmrootcert.pem 2022-07-28 00:29:52.277 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Executing command --> rm /tmp/vidmrootcert.pem 2022-07-28 00:29:52.828 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- exit-status: 0 2022-07-28 00:29:52.828 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command executed sucessfully 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.u.SshUtils - -- Command execution response: { "exitStatus" : 0, "outputData" : "", "errorData" : null, "commandTimedOut" : false } 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Command Status code :: 0 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Output Stream :: 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] 
c.v.v.l.d.v.h.VraPreludeInstallHelper - -- Error Stream :: 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- null 2022-07-28 00:29:52.829 INFO [pool-3-thread-10] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- ==================================================== 2022-07-28 00:29:52.830 INFO [pool-3-thread-10] c.v.v.l.p.c.v.t.VraVaSetVidmTask - -- Setting VIDM for vRA completed with status : true 2022-07-28 00:29:52.830 INFO [pool-3-thread-10] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnSetVIDMVaCompletion * * * 2022-07-28 00:29:52.834 INFO [pool-3-thread-10] c.v.v.l.p.a.s.Task - -- FIELD NAME :: componentSpec 2022-07-28 00:29:52.834 INFO [pool-3-thread-10] c.v.v.l.p.a.s.Task - -- KEY PICKER IS :: com.vmware.vrealize.lcm.plugin.core.vra80.tasks.keypicker.VravaComponentSpecKeyPicker 2022-07-28 00:29:53.385 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "1cb3e351-4235-4824-8fb5-4cbc9c4a80d2", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658968192835, "lastUpdatedOn" : 1658968193334, "version" : "8.1.0.0", "vrn" : null, "eventName" : "OnSetVIDMVaCompletion", "currentState" : null, "eventArgument" : 
"{\"productSpec\":{\"name\":\"productSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ProductSpecification\",\"value\":\"null\"},\"componentSpec\":{\"name\":\"componentSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ComponentDeploymentSpecification\",\"value\":\"{\\\"component\\\":{\\\"symbolicName\\\":\\\"vravaretrustvidm\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"cafeHostNamePrimary\\\":\\\"vra.cap.org\\\",\\\"cafeRootPasswordPrimary\\\":\\\"JXJXJXJX\\\",\\\"vidmPrimaryNodeRootPassword\\\":\\\"JXJXJXJX\\\",\\\"baseTenantId\\\":\\\"idm\\\",\\\"uberAdminUserType\\\":\\\"LOCAL\\\",\\\"version\\\":\\\"8.8.1\\\",\\\"masterVidmAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"uberAdmin\\\":\\\"configadmin\\\",\\\"masterVidmEnabled\\\":\\\"true\\\",\\\"__version\\\":\\\"8.8.1\\\",\\\"uberAdminPassword\\\":\\\"JXJXJXJX\\\",\\\"masterVidmHostName\\\":\\\"idm.cap.org\\\",\\\"masterVidmAdminUserName\\\":\\\"admin\\\",\\\"isLBSslTerminated\\\":\\\"false\\\",\\\"authProviderHostnames\\\":\\\"idm.cap.org\\\",\\\"vidmPrimaryNodeHostname\\\":\\\"idm.cap.org\\\"}},\\\"priority\\\":KXKXKXKX\"}}", "status" : "CREATED", "stateMachineInstance" : "ba1d3e0f-6c87-4e29-bca6-c347dd9bda6b", "errorCause" : null, "sequence" : 563581, "eventLock" : 1, "engineNodeId" : "lcm.cap.org" } * * * 2022-07-28 00:29:53.501 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnSetVIDMVaCompletion 18. Set load balancer task is initiated and completed. 
2022-07-28 00:29:53.501 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaSetVidmTask 2022-07-28 00:29:53.501 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaSetLoadBalancerTask 2022-07-28 00:29:53.603 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaSetLoadBalancerTask 2022-07-28 00:29:53.869 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:29:53.871 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:29:53.872 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:53.875 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:53.877 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:53.880 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: configurationPropertyService 2022-07-28 00:29:53.880 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:29:53.881 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 2022-07-28 00:29:54.005 INFO [pool-3-thread-47] c.v.v.l.p.c.v.t.VraVaSetLoadBalancerTask - -- Starting :: vRA Set LB Task.... 
2022-07-28 00:29:54.130 INFO [pool-3-thread-47] c.v.v.l.p.c.v.t.u.VraVaTaskUtil - -- vRA Configuration property setLoadBalancerCertificate value is obtained from Config Service : true 2022-07-28 00:29:54.130 INFO [pool-3-thread-47] c.v.v.l.p.c.v.t.VraVaSetLoadBalancerTask - -- LB information not provided skipping LB 2022-07-28 00:29:54.130 INFO [pool-3-thread-47] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnLoadBalancerSetCompletion 19. The vRA VA initialize task starts, which runs /opt/scripts/deploy.sh to restart the services and initialize the pods 2022-07-28 00:29:54.641 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaInitializeTask 2022-07-28 00:29:54.787 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaInitializeTask 2022-07-28 00:29:54.866 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:29:54.868 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:29:54.869 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:54.873 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:54.876 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:29:54.881 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:29:54.881 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
2022-07-28 00:29:54.932 INFO [pool-3-thread-7] c.v.v.l.p.c.v.t.VraVaInitializeTask - -- Starting :: Initialize vRA VA Task 2022-07-28 00:29:54.960 INFO [pool-3-thread-7] c.v.v.l.p.c.v.t.VraVaInitializeTask - -- isCavaDeployment :false deployOptions: null 2022-07-28 00:29:54.967 INFO [pool-3-thread-7] c.v.v.l.p.c.v.t.VraVaInitializeTask - -- Running Deploy Script on vRA VA : vra.cap.org 2022-07-28 00:29:54.967 INFO [pool-3-thread-7] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- PRELUDE ENDPOINT HOST :: vra.cap.org 2022-07-28 00:29:54.989 INFO [pool-3-thread-7] c.v.v.l.d.v.h.VraPreludeInstallHelper - -- COMMAND :: /opt/scripts/deploy.sh 2022-07-28 00:29:55.187 INFO [pool-3-thread-7] c.v.v.l.u.SshUtils - -- Executing command --> /opt/scripts/deploy.sh 20. This is when, on the vRA appliance, you would see the pods being redeployed ========================= [2022-07-28 00:29:55.456+0000] Waiting for deploy healthcheck ========================= ========================= [2022-07-28 00:29:57.740+0000] Backing up databases from existing pods ========================= ========================= [2022-07-28 00:30:03.806+0000] Waiting for command execution pods ========================= ========================= [2022-07-28 00:30:12.060+0000] Tear down existing deployment ========================= ========================= [2022-07-28 00:34:13.878+0000] Creating kubernetes namespaces ========================= ========================= [2022-07-28 00:34:22.435+0000] Applying ingress certificate ========================= ========================= [2022-07-28 00:34:28.153+0000] Updating etcd configuration to include https_proxy if such exists ========================= ========================= [2022-07-28 00:34:38.749+0000] Deploying infrastructure services ========================= ========================= + set +x [2022-07-28 00:34:47.336+0000] Creating database pods in previous mode if necessary for migration ========================= ========================= [2022-07-28 00:36:41.814+0000] 
Clearing liquibase locks ========================= ========================= [2022-07-28 00:37:06.654+0000] DB upgrade schema ========================= [2022-07-28 00:39:19.814+0000] Populating initial identity-service data ========================= ========================= [2022-07-28 00:39:57.218+0000] Deploying application services ========================= ========================= [2022-07-28 00:52:13.054+0000] Deploying application UI ========================= ========================= [2022-07-28 00:53:47.981+0000] Setting feature UI toggles to Provisioning service ========================= ========================= [2022-07-28 00:54:02.647+0000] Check embedded endpoints ========================= Prelude has been deployed successfully ========================= 21. vRSLCM reports that the vRA VA services have started 2022-07-28 00:54:08.540 INFO [pool-3-thread-7] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnInitilizeVaCompletion 22. Now it starts the vRA VA Update Certificate Inventory task 2022-07-28 00:54:09.056 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnInitilizeVaCompletion 2022-07-28 00:54:09.056 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaInitializeTask 2022-07-28 00:54:09.057 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaUpdateCertificateInInventoryTask 2022-07-28 00:54:09.065 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaUpdateCertificateInInventoryTask 2022-07-28 00:54:09.106 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:54:09.107 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:54:09.108 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: 
cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.114 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.117 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.118 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:54:09.120 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 2022-07-28 00:54:09.134 INFO [pool-3-thread-34] c.v.v.l.p.c.v.t.VraVaUpdateCertificateInInventoryTask - -- Starting :: vRA Certificate Inventory Update Task.... 2022-07-28 00:54:09.136 INFO [pool-3-thread-34] c.v.v.l.u.CertificateUtil - -- requestUrl : https://vra.cap.org 2022-07-28 00:54:09.236 INFO [pool-3-thread-34] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVravaCertificateInventoryCompletion 23. Update all allowed redirects task is triggered. 
These are the links available in the products when a user logs into vIDM 2022-07-28 00:54:09.612 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Responding for Edge :: OnVravaCertificateInventoryCompletion 2022-07-28 00:54:09.612 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.core.vra80.task.VraVaUpdateCertificateInInventoryTask 2022-07-28 00:54:09.612 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask 2022-07-28 00:54:09.620 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask 2022-07-28 00:54:09.665 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: productSpec 2022-07-28 00:54:09.667 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Locker Object :: componentSpec 2022-07-28 00:54:09.669 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: cafeRootPasswordPrimary<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.674 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: uberAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.677 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: vidmPrimaryNodeRootPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.680 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- KEY_VARIABLE :: masterVidmAdminPassword<=KXKXKXKX KEY :: XXXXXX 2022-07-28 00:54:09.683 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-28 00:54:09.683 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
2022-07-28 00:54:09.697 INFO [pool-3-thread-33] c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask - -- Starting vIDM update allowed redirects 2022-07-28 00:54:09.699 INFO [pool-3-thread-33] c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask - -- vRA hostname : vra.cap.org 2022-07-28 00:54:09.699 INFO [pool-3-thread-33] c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask - -- Redirects supposed to added : https://vra.cap.org* 2022-07-28 00:54:09.904 INFO [pool-3-thread-33] c.v.v.l.v.d.r.c.VidmRestClient - -- API Response Status : 200 Response Message : {"allowedRedirects":["https://lcm.cap.org*","https://vra.cap.org*","https://migvra.cap.org*","https://tvra.cap.org*","https://hvra.cap.org*","https://nusvra.cap.org *","https://testvra.cap.org*"],"_links":{"self":{"href":"/SAAS/jersey/manager/api/authsettings/allowedredirects"}}} 2022-07-28 00:54:09.908 INFO [pool-3-thread-33] c.v.v.l.v.d.r.u.VidmServerRestUtil - -- Response for GET allowed redirects : VidmRestClientResponseDTO [statusCode=200, responseMessage={"allowedRedirects":["https://lcm.cap.org*","https://vra.cap.org*","https://migvra.cap.org*","https://tvra.ca p.org*","https://hvra.cap.org*","https://nusvra.cap.org*","https://testvra.cap.org*"],"_links":{"self":{"href":"/SAAS/jersey/manager/api/authsettings/allowedredirects"}}}] 2022-07-28 00:54:09.908 INFO [pool-3-thread-33] c.v.v.l.v.d.r.u.VidmServerRestUtil - -- Response for GET allowed redirects : VidmRestClientResponseDTO [statusCode=200, responseMessage={"allowedRedirects":["https://lcm.cap.org*","https://vra.cap.org*","https://migvra.cap.org*","https://tvra.ca p.org*","https://hvra.cap.org*","https://nusvra.cap.org*","https://testvra.cap.org*"],"_links":{"self":{"href":"/SAAS/jersey/manager/api/authsettings/allowedredirects"}}}] 2022-07-28 00:54:09.914 INFO [pool-3-thread-33] c.v.v.l.v.d.r.u.VidmServerRestUtil - -- Allowed redirects on the vIDM :: [https://lcm.cap.org*, https://vra.cap.org*, https://migvra.cap.org*, https://tvra.cap.org*, https://hvra.cap.org*, 
https://nusvra.cap.org*, https://testvra.cap.org*] 2022-07-28 00:54:09.914 INFO [pool-3-thread-33] c.v.v.l.v.d.r.u.VidmServerRestUtil - -- Skipping updating allowed redirect URLs, as all the required URLs already exists on the vIDM :: idm.cap.org 2022-07-28 00:54:09.914 INFO [pool-3-thread-33] c.v.v.l.v.c.t.VidmUpdateAllowedRedirectsTask - -- vIDM update allowed redirects task done 2022-07-28 00:54:09.915 INFO [pool-3-thread-33] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVidmUpdateAllowRedirectSuccess 24. FinalTask is invoked to finish the request 2022-07-28 00:54:10.195 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.vidm.core.task.VidmUpdateAllowedRedirectsTask 2022-07-28 00:54:10.195 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask 2022-07-28 00:54:10.203 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.platform.automata.service.task.FinalTask My vIDM and vRA were standalone, not clustered; I will find some time to test this on a clustered deployment too. Nothing would really change apart from the number of nodes, and the steps would remain the same.
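The skip decision in step 23 can be sketched in a few lines (a rough illustration of the task's logic under my own assumptions, not VMware's actual code; `needs_redirect_update` is a hypothetical helper, and the sample body is trimmed from the VidmRestClient response shown above):

```python
import json

# Sample body trimmed from the GET /SAAS/jersey/manager/api/authsettings/allowedredirects
# response in the log above
body = json.dumps({"allowedRedirects": ["https://lcm.cap.org*", "https://vra.cap.org*"]})

def needs_redirect_update(response_body: str, vra_hostname: str) -> bool:
    """True when 'https://<vra-host>*' is missing from vIDM's current
    allowed-redirect list, i.e. when LCM would have to push an update."""
    allowed = json.loads(response_body).get("allowedRedirects", [])
    return f"https://{vra_hostname}*" not in allowed

print(needs_redirect_update(body, "vra.cap.org"))     # False: already present, so LCM skips
print(needs_redirect_update(body, "newvra.cap.org"))  # True: LCM would push the updated list
```

This matches the behaviour in the log: the redirect for vra.cap.org already exists, so the task logs "Skipping updating allowed redirect URLs" and completes immediately.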

  • Unable to generate vRSLCM log bundle through UI

There was a recent scenario where vRSLCM log bundle generation was not happening through the UI. So let's take a minute to see what happens when you generate a log bundle, and then inspect what could have gone wrong in the earlier case where one wasn't being generated. When I click on Generate Log Bundle, it creates a request, a planner spec and then the engine request. What's important here is that the downloadUrl points to the previously generated log bundle. 2022-07-27 09:25:15.649 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:25:15.660 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : loginsightsetting 2022-07-27 09:25:53.795 INFO [http-nio-8080-exec-9] c.v.v.l.l.c.SettingsController - -- Validation result for Setting :logbundledownload result true 2022-07-27 09:25:53.846 INFO [http-nio-8080-exec-9] c.v.v.l.l.c.SettingsController - -- Setting Value before save: "{\n \"downloadUrl\" : \"https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip\"\n}" 2022-07-27 09:25:53.926 INFO [http-nio-8080-exec-9] c.v.v.l.l.u.RequestSubmissionUtil - -- ++++++++++++++++++ Creating request to Request_Service :::>>> { "vmid" : "2d03775b-9f49-4982-879b-26195ed74cac", "transactionId" : null, "tenant" : "default", "requestName" : "lcmsupportbundle", "requestReason" : "Get vRSLCM Support Bundle", "requestType" : "lcmsupportbundle", "requestSource" : null, "requestSourceType" : "user", "inputMap" : { "downloadUrl" : "https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip" }, "outputMap" : { }, "state" : "CREATED", "executionId" : null, "executionPath" : null, "executionStatus" : null, "errorCause" : null, "resultSet" : null, "isCancelEnabled" : null, "lastUpdatedOn" : 1658913953925, "createdBy" : null } 2022-07-27 09:25:54.021 INFO 
[scheduling-1] c.v.v.l.r.c.p.GenericEnvironmentPlanner - -- Generic Planner SPEC :: { "vmid" : "117d667d-f8d6-4602-9d97-874f7b4fd39d", "tenant" : "default", "originalRequest" : null, "enhancedRequest" : null, "symbolicName" : "90e136b4-4d83-4c61-ab81-7378fdf324ae", "acceptEula" : true, "variables" : { }, "products" : [ { "symbolicName" : "lcmsupportbundle", "displayName" : null, "productVersion" : null, "priority" : 0, "dependsOn" : [ ], "components" : [ { "component" : { "symbolicName" : "lcmsupportbundle", "type" : null, "componentVersion" : null, "properties" : { "downloadUrl" : "https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip", "isVcfUser" : "false" } }, "priority" : 0 } ] } ] } 2022-07-27 09:25:54.027 INFO [scheduling-1] c.v.v.l.r.c.RequestProcessor - -- ENGINE REQUEST :: { "vmid" : "117d667d-f8d6-4602-9d97-874f7b4fd39d", "tenant" : "default", "originalRequest" : null, "enhancedRequest" : null, "symbolicName" : "90e136b4-4d83-4c61-ab81-7378fdf324ae", "acceptEula" : true, "variables" : { }, "products" : [ { "symbolicName" : "lcmsupportbundle", "displayName" : null, "productVersion" : null, "priority" : 0, "dependsOn" : [ ], "components" : [ { "component" : { "symbolicName" : "lcmsupportbundle", "type" : null, "componentVersion" : null, "properties" : { "downloadUrl" : "https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip", "isVcfUser" : "false" } }, "priority" : 0 } ] } ] } The actual or current log bundle generation starts 2022-07-27 09:25:55.157 INFO [scheduling-1] c.v.v.l.a.c.FlowProcessor - -- Injected OnStart Edge for the Machine ID :: lcmsupportbundle 2022-07-27 09:25:55.201 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- INITIALIZING NEW EVENT :: { "vmid" : "2036f6ef-a34c-4c66-a414-671a403057f0", "transactionId" : null, "tenant" : "default", "createdBy" : "root", "lastModifiedBy" : "root", "createdOn" : 1658913955155, "lastUpdatedOn" : 1658913955182, 
"version" : "8.1.0.0", "vrn" : null, "eventName" : "OnStart", "currentState" : null, "eventArgument" : "{\"productSpec\":{\"name\":\"productSpec\",\"type\":\"com.vmware.vrealize.lcm.domain.ProductSpecification\",\"value\":\"{\\\"symbolicName\\\":\\\"lcmsupportbundle\\\",\\\"displayName\\\":null,\\\"productVersion\\\":null,\\\"priority\\\":0,\\\"dependsOn\\\":[],\\\"components\\\":[{\\\"component\\\":{\\\"symbolicName\\\":\\\"lcmsupportbundle\\\",\\\"type\\\":null,\\\"componentVersion\\\":null,\\\"properties\\\":{\\\"downloadUrl\\\":\\\"https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip\\\",\\\"isVcfUser\\\":\\\"false\\\"}},\\\"priority\\\":0}]}\"}}", "status" : "CREATED", "stateMachineInstance" : "9b69126f-58e7-47de-afa7-3bf907dd6756", "errorCause" : null, "sequence" : 560741, "eventLock" : 1, "engineNodeId" : "lcm.cap.org" } 2022-07-27 09:25:55.209 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- GETTING MACHINE FOR THE KEY :: lcmsupportbundle 2022-07-27 09:25:55.209 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- QUERYING CONTENT :: SystemFlowInventory::flows::flow->lcmsupportbundle 2022-07-27 09:25:55.210 INFO [scheduling-1] c.v.v.l.d.i.u.InventorySchemaQueryUtil - -- GETTING ROOT NODE FOR :: SystemFlowInventory 2022-07-27 09:25:55.228 INFO [scheduling-1] c.v.v.l.a.c.MachineRegistry - -- URL :: /system/flow/lcmsupportbundle.vmfx 2022-07-27 09:25:55.229 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- INSIDE ContentDownloadControllerImpl 2022-07-27 09:25:55.229 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- REPO_NAME :: /systemflowrepo 2022-07-27 09:25:55.229 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- CONTENT_PATH :: /system/flow/lcmsupportbundle.vmfx 2022-07-27 09:25:55.229 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- URL :: /systemflowrepo/system/flow/lcmsupportbundle.vmfx 2022-07-27 09:25:55.229 INFO [scheduling-1] 
c.v.v.l.c.c.ContentDownloadController - -- Decoded URL :: /systemflowrepo/system/flow/lcmsupportbundle.vmfx 2022-07-27 09:25:55.230 INFO [scheduling-1] c.v.v.l.c.c.ContentDownloadController - -- ContentDTO{BaseDTO{vmid='lcmsupportbundle', version=8.1.0.0} -> repoName='systemflowrepo', contentState='PUBLISHED', url='/systemflowrepo/system/flow/lcmsupportbundle.vmfx'} 2022-07-27 09:25:55.231 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- State to find :: com.vmware.vrealize.lcm.plugin.lcmplugin.core.task.LcmSupportBundleTask 2022-07-27 09:25:55.235 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Invoking Task :: com.vmware.vrealize.lcm.plugin.lcmplugin.core.task.LcmSupportBundleTask 2022-07-27 09:25:55.502 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: contentRepositoryController 2022-07-27 09:25:55.504 INFO [scheduling-1] c.v.v.l.a.c.EventProcessor - -- Injecting Bean :: settingsController 2022-07-27 09:25:55.504 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Start Instrumenting EventMetadata. 2022-07-27 09:25:55.506 INFO [scheduling-1] c.v.v.l.c.u.EventExecutionTelemetryUtil - -- Stop Instrumenting EventMetadata. 
2022-07-27 09:25:55.511 INFO [pool-3-thread-14] c.v.v.l.p.l.c.t.LcmSupportBundleTask - -- Starting :: LCM VA support Bundle Task 2022-07-27 09:25:55.528 INFO [pool-3-thread-14] c.v.v.l.u.ShellExecutor - -- Executing shell command: /var/lib/vlcm-common/vlcm-support -w /data/lcm-logbundle 2022-07-27 09:25:55.530 INFO [pool-3-thread-14] c.v.v.l.u.ProcessUtil - -- Execute /var/lib/vlcm-common/vlcm-support 2022-07-27 09:25:58.091 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:25:58.099 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:02.098 INFO [http-nio-8080-exec-10] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:02.098 INFO [http-nio-8080-exec-10] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:06.092 INFO [http-nio-8080-exec-2] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:06.143 INFO [http-nio-8080-exec-2] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload * * * * 2022-07-27 09:26:10.097 INFO [http-nio-8080-exec-1] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:10.097 INFO [http-nio-8080-exec-1] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:14.101 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:14.102 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:18.105 INFO [http-nio-8080-exec-3] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:18.173 INFO [http-nio-8080-exec-3] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:22.096 INFO 
[http-nio-8080-exec-7] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:22.097 INFO [http-nio-8080-exec-7] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:26.104 INFO [http-nio-8080-exec-5] c.v.v.l.l.c.SettingsController - -- queryParams : { } 2022-07-27 09:26:26.104 INFO [http-nio-8080-exec-5] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:28.068 INFO [pool-3-thread-14] c.v.v.l.u.ShellExecutor - -- Result: [Support program for VMware vRealize LCM Appliance - Version 8.2.0 adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/auth.log (stored 0%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/boot.log (deflated 91%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/bootstrap/everyboot.log (deflated 96%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/bootstrap/firstboot.log (stored 0%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/bootstrap/postinstall.log (stored 0%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/bootstrap/postupdate.log (deflated 79%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/bootstrap/preupdate.log (deflated 45%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/cloud-init-output.log (deflated 92%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/cloud-init.log (deflated 90%) zip warning: name not matched: vlcmsupport-2022-07-27_09-25-55.16453/var/log/dracut.log zip error: Nothing to do! (vlcmsupport-2022-07-27_09-25-55.16453.zip) Warning..detected exception condition. 
Reason: 12 adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/installer-kickstart.log (stored 0%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/installer.log (stored 0%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liagent_2022-07-14_23.log (deflated 95%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2021-10-08_00.log (deflated 78%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2021-12-02_01.log (deflated 71%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2021-12-02_02.log (deflated 71%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2021-12-13_03.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-01-06_04.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-01-06_05.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-01-19_06.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-03-23_07.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-03-23_08.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-04-24_09.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-05-06_10.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-06-15_11.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-07-14_12.log (deflated 70%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/loginsight-agent/liupdater_2022-07-14_13.log (deflated 60%) adding: vlcmsupport-2022-07-27_09-25-55.16453/var/log/nginx/access.log zip warning: file size changed while 
zipping vlcmsupport-2022-07-27_09-25-55.16453/var/log/nginx/access.log (deflated 96%) * * * adding: vlcmsupport-2022-07-27_09-25-55.16453/tmp/service-status-all.txt (deflated 79%) adding: vlcmsupport-2022-07-27_09-25-55.16453/tmp/chkconfig-list.txt (deflated 43%) adding: vlcmsupport-2022-07-27_09-25-55.16453/tmp/ovfenv-D.16453.txt (deflated 77%) adding: vlcmsupport-2022-07-27_09-25-55.16453/tmp/ovfenv-xml.16453.txt (deflated 71%) adding: vlcmsupport-2022-07-27_09-25-55.16453/tmp/vlcm_build_details.txt (deflated 34%) File: /data/lcm-logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip]. 2022-07-27 09:26:28.835 INFO [pool-3-thread-14] c.v.v.l.l.u.SettingsHelper - -- Setting name : vcfmodesettings , Package name : com.vmware.vrealize.lcm.lcops.common.dto.settings.VcfConfigurationDTO 2022-07-27 09:26:28.848 INFO [pool-3-thread-14] c.v.v.l.l.u.SettingsHelper - -- Final Data : { "vcfHostname" : null, "username" : null, "apiKey" : null, "version" : null } 2022-07-27 09:26:28.856 INFO [pool-3-thread-14] c.v.v.l.p.l.c.t.LcmSupportBundleTask - -- Result archive is: '/data/lcm-logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip' 2022-07-27 09:26:28.856 INFO [pool-3-thread-14] c.v.v.l.p.l.c.t.LcmSupportBundleTask - -- Support bundle created successfully in path : /data/lcm-logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip 2022-07-27 09:26:28.886 INFO [pool-3-thread-14] c.v.v.l.p.a.s.Task - -- Injecting Edge :: OnVaSupportBundleCompletion 2022-07-27 09:26:28.900 INFO [pool-3-thread-14] c.v.v.l.p.l.c.t.LcmSupportBundleTask - -- old download url : https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-01-25_01-33-57.44075.zip 2022-07-27 09:26:28.912 INFO [pool-3-thread-14] c.v.v.l.c.c.ContentRepositoryController - -- Content delete requested for url /logBundleRepo/vrlcm/logbundlevlcmsupport-2022-01-25_01-33-57.44075.zip. 
2022-07-27 09:26:28.943 INFO [pool-3-thread-14] c.v.v.l.c.c.ContentRepositoryController - -- Content delete is failed with given URL :: /logBundleRepo/vrlcm/logbundlevlcmsupport-2022-01-25_01-33-57.44075.zip 2022-07-27 09:26:29.056 INFO [pool-3-thread-14] c.v.v.l.c.c.ContentRepositoryController - -- Creating content operation. 2022-07-27 09:26:29.088 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- URL :: /logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip 2022-07-27 09:26:29.111 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- URL :: /logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip 2022-07-27 09:26:29.115 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH LENGTH :: 5 2022-07-27 09:26:29.116 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH LENGTH TEST PASSED 2022-07-27 09:26:29.123 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> logBundleRepo :: NAME -> __ROOT__ :KXKXKXKX PARENT -> da0c4d6f-d0de-4837-835b-6f0d6e01b15b 2022-07-27 09:26:29.249 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH :: 2022-07-27 09:26:29.250 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::logBundleRepo 2022-07-27 09:26:29.250 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::vrlcm 2022-07-27 09:26:29.251 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::logbundle 2022-07-27 09:26:29.251 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- PATH ::vlcmsupport-2022-07-27_09-25-55.16453.zip 2022-07-27 09:26:29.253 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- ADDING NODE - PATH LENGTH :: 5 2022-07-27 09:26:29.254 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> logBundleRepo :: PARENT -> 
2761c604-c889-4c28-834a-23423133cbf1 :: NAME -> vrlcm 2022-07-27 09:26:29.259 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> logBundleRepo :: PARENT -> 3172ba88-32f7-4f07-ba92-407abbdb37d3 :: NAME -> logbundle 2022-07-27 09:26:29.262 INFO [pool-3-thread-14] c.v.v.l.c.s.ContentDownloadUrlServiceImpl - -- SEARCHINE FOR :: REPO -> logBundleRepo :: PARENT -> 024b449c-18cc-4dab-8e0b-9e9a47702453 :: NAME -> vlcmsupport-2022-07-27_09-25-55.16453.zip Finally, the completion of the log bundle collection is marked, and the previous download URL is replaced with the URL of the recently generated bundle 2022-07-27 09:26:30.097 INFO [http-nio-8080-exec-9] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-27 09:26:31.852 INFO [pool-3-thread-14] c.v.v.l.p.l.c.t.LcmSupportBundleTask - -- finalUrl to download log bundle : https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip 2022-07-27 09:26:34.091 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload In the failed scenario, what we see is the below message repeating endlessly; the request just does not move forward 2022-07-21 21:05:35.937 INFO [http-nio-8080-exec-4] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:05:39.932 INFO [http-nio-8080-exec-11] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:05:43.934 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:05:47.940 INFO [http-nio-8080-exec-12] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:05:51.935 INFO [http-nio-8080-exec-10] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data 
required for setting : logbundledownload 2022-07-21 21:05:55.943 INFO [http-nio-8080-exec-3] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:05:59.940 INFO [http-nio-8080-exec-8] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:03.929 INFO [http-nio-8080-exec-2] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:07.938 INFO [http-nio-8080-exec-5] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:11.938 INFO [http-nio-8080-exec-9] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:15.932 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:19.938 INFO [http-nio-8080-exec-12] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:23.928 INFO [http-nio-8080-exec-10] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:27.934 INFO [http-nio-8080-exec-1] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:31.930 INFO [http-nio-8080-exec-7] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:35.938 INFO [http-nio-8080-exec-2] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:39.939 INFO [http-nio-8080-exec-5] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:43.928 INFO [http-nio-8080-exec-11] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 
2022-07-21 21:06:47.938 INFO [http-nio-8080-exec-9] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:51.928 INFO [http-nio-8080-exec-6] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:55.938 INFO [http-nio-8080-exec-12] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload 2022-07-21 21:06:59.931 INFO [http-nio-8080-exec-3] c.v.v.l.l.c.SettingsController - -- Dynamic Setting data required for setting : logbundledownload As per data from the database Problematic Environment vrlcm=# select * from vm_lcops_settings where name = 'logbundledownload'; -[ RECORD 1 ]-------+----------------------------------------------------------------------------------------------------------------------------- vmid | b8******e0 createdby | serviceadmin@local createdon | 1638384524109 lastmodifiedby | serviceadmin@local lastupdatedon | 1638384524109 tenant | default version | 8.1.0.0 vrn | additionaldata | datatype | Request description | Trigger Log bundle download detaileddescription | Triggers log collection for all nodes in the environment. Returns request id that can be used to check log collection status name | logbundledownload packagename | value | { + | "requestid" : "28****f0" | } My Lab vrlcm=# select * from vm_lcops_settings where name = 'logbundledownload'; -[ RECORD 1 ]-------+----------------------------------------------------------------------------------------------------------------------------- vmid | 9b675941-a97e-4b66-9f4c-4f0391a77017 createdby | serviceadmin@local createdon | 1638415924109 lastmodifiedby | serviceadmin@local lastupdatedon | 1638415924109 tenant | default version | 8.1.0.0 vrn | additionaldata | datatype | Request description | Trigger Log bundle download detaileddescription | Triggers log collection for all nodes in the environment. 
Returns request id that can be used to check log collection status name | logbundledownload packagename | value | { + | "downloadUrl" : "https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip" + | } The difference is a wrong value in the vm_lcops_settings table: the logbundledownload record has a request id set in it instead of a downloadUrl. Remediation Plan Take a snapshot of vRSLCM Execute the below query to remove the existing value update vm_lcops_settings set value = '' where name = 'logbundledownload'; Restart vRSLCM systemctl restart vrslcm-server Log in to the UI Generate the log bundle One can even check from the API whether the value is set correctly Method: GET URL: {{lcmurl}}/lcm/lcops/api/settings/logbundledownload Hope this helps you understand what happens in the background and how to remediate the failure when it is seen
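To spot this condition programmatically, the setting's value can be parsed as JSON and checked for a downloadUrl key. Below is a minimal sketch of that check; the function name and the sample strings are my own (the real request id was masked in the output above), only the JSON shapes come from the database records shown.

```python
import json

def logbundle_setting_is_healthy(value: str) -> bool:
    """Return True when the logbundledownload value holds a downloadUrl.

    The problematic record parses as JSON too, but carries a requestid
    key instead of downloadUrl.
    """
    if not value or not value.strip():
        # The remediation query sets an empty value; vRSLCM repopulates
        # it on the next successful log bundle generation.
        return False
    try:
        data = json.loads(value)
    except ValueError:
        return False
    return isinstance(data, dict) and "downloadUrl" in data

# Shapes taken from the database records above; the request id below is
# a placeholder because the real one was masked.
healthy = '{ "downloadUrl" : "https://lcm.cap.org/repo/logBundleRepo/vrlcm/logbundle/vlcmsupport-2022-07-27_09-25-55.16453.zip" }'
broken = '{ "requestid" : "<masked-request-id>" }'
```

Running the check against the value returned by GET {{lcmurl}}/lcm/lcops/api/settings/logbundledownload would flag the problematic environment while passing the healthy one.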

  • Upgrade Planner Controller APIs

    There are two APIs that the Upgrade Planner Controller provides. Let's explore these APIs and see how we can fetch our desired result getVrsProductsVersions Inputs Method: GET URL: {{lcmurl}}/lcm/lcops/api/v2/getTargetVersions Content Type: application/json showUpgradePath Inputs Method: POST URL: {{lcmurl}}/lcm/lcops/api/v2/findUpgradePath Content Type: application/json upgradePathInputs: [ { "productId": "string", "fromVersion": "string", "toVersion": "string", "required": true } ] Sample Body [ { "productId": "vrslcm", "fromVersion": "8.8.2", "toVersion": "8.8.2", "required": false }, { "productId": "vra", "fromVersion": "8.8.1", "toVersion": "8.8.2", "required": true }, { "productId": "vrli", "fromVersion": "8.6.2", "toVersion": "8.8.2", "required": true }, { "productId": "vrops", "fromVersion": "8.6.2", "toVersion": "8.6.3", "required": true }, { "productId": "vrni", "fromVersion": "6.6.0", "toVersion": "6.7.0", "required": true }, { "productId": "vidm", "fromVersion": "3.3.6", "toVersion": "3.3.6", "required": true } ]
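The sample body above follows a regular pattern: vrslcm leads the list with required set to false, and each managed product follows with its from/to versions. A small helper can build that payload; the function below is my own sketch, only the endpoint and field names come from the inputs shown above.

```python
import json

def build_upgrade_path_inputs(products, lcm_version="8.8.2"):
    """Build the upgradePathInputs body for POST .../findUpgradePath.

    `products` maps productId -> (fromVersion, toVersion). vrslcm itself
    leads the list with required=False, mirroring the sample body above.
    """
    body = [{"productId": "vrslcm", "fromVersion": lcm_version,
             "toVersion": lcm_version, "required": False}]
    for product_id, (from_version, to_version) in products.items():
        body.append({"productId": product_id, "fromVersion": from_version,
                     "toVersion": to_version, "required": True})
    return body

payload = build_upgrade_path_inputs({"vra": ("8.8.1", "8.8.2"),
                                     "vidm": ("3.3.6", "3.3.6")})
print(json.dumps(payload, indent=2))
```

The resulting list can be serialized with json.dumps and sent as the request body with Content-Type application/json.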

  • Installing Jenkins

    Use-Case Documenting the steps needed to install Jenkins on an Ubuntu virtual machine and then integrating it with vRealize Automation to trigger some pipelines Procedure Deploy an Ubuntu virtual machine (either on-prem or in the cloud) Update the apt repositories, then install OpenJDK 11 root@jenkins:~# sudo apt update Hit:1 http://in.archive.ubuntu.com/ubuntu focal InRelease Get:2 http://in.archive.ubuntu.com/ubuntu focal-updates InRelease [114 kB] Get:3 http://in.archive.ubuntu.com/ubuntu focal-backports InRelease [108 kB] Get:4 http://in.archive.ubuntu.com/ubuntu focal-security InRelease [114 kB] Get:5 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [1,793 kB] Get:6 http://in.archive.ubuntu.com/ubuntu focal-updates/main Translation-en [330 kB] Get:7 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 c-n-f Metadata [15.2 kB] Get:8 http://in.archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [976 kB] Get:9 http://in.archive.ubuntu.com/ubuntu focal-updates/restricted Translation-en [139 kB] Get:10 http://in.archive.ubuntu.com/ubuntu focal-updates/restricted amd64 c-n-f Metadata [520 B] Get:11 http://in.archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [924 kB] Get:12 http://in.archive.ubuntu.com/ubuntu focal-updates/universe Translation-en [207 kB] Get:13 http://in.archive.ubuntu.com/ubuntu focal-updates/universe amd64 c-n-f Metadata [20.7 kB] Get:14 http://in.archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [24.4 kB] Get:15 http://in.archive.ubuntu.com/ubuntu focal-updates/multiverse Translation-en [7,336 B] Get:16 http://in.archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 c-n-f Metadata [596 B] Get:17 http://in.archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [68.1 kB] Get:18 http://in.archive.ubuntu.com/ubuntu focal-backports/main Translation-en [10.9 kB] Get:19 http://in.archive.ubuntu.com/ubuntu focal-backports/main amd64 c-n-f Metadata [980 B] Get:20 
http://in.archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [26.8 kB] Get:21 http://in.archive.ubuntu.com/ubuntu focal-backports/universe Translation-en [15.9 kB] Get:22 http://in.archive.ubuntu.com/ubuntu focal-backports/universe amd64 c-n-f Metadata [860 B] Get:23 http://in.archive.ubuntu.com/ubuntu focal-security/main amd64 Packages [1,453 kB] Get:24 http://in.archive.ubuntu.com/ubuntu focal-security/main Translation-en [250 kB] Get:25 http://in.archive.ubuntu.com/ubuntu focal-security/main amd64 c-n-f Metadata [10.2 kB] Get:26 http://in.archive.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [914 kB] Get:27 http://in.archive.ubuntu.com/ubuntu focal-security/restricted Translation-en [130 kB] Get:28 http://in.archive.ubuntu.com/ubuntu focal-security/restricted amd64 c-n-f Metadata [520 B] Get:29 http://in.archive.ubuntu.com/ubuntu focal-security/universe amd64 Packages [703 kB] Get:30 http://in.archive.ubuntu.com/ubuntu focal-security/universe Translation-en [125 kB] Get:31 http://in.archive.ubuntu.com/ubuntu focal-security/universe amd64 c-n-f Metadata [14.4 kB] Get:32 http://in.archive.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [22.2 kB] Get:33 http://in.archive.ubuntu.com/ubuntu focal-security/multiverse Translation-en [5,376 B] Get:34 http://in.archive.ubuntu.com/ubuntu focal-security/multiverse amd64 c-n-f Metadata [512 B] Fetched 8,527 kB in 2s (3,810 kB/s) Reading package lists... Done Building dependency tree Reading state information... Done 150 packages can be upgraded. Run 'apt list --upgradable' to see them. root@jenkins:~# sudo apt install openjdk-11-jre Reading package lists... Done Building dependency tree Reading state information... 
Done The following additional packages will be installed: at-spi2-core ca-certificates-java fontconfig-config fonts-dejavu-core fonts-dejavu-extra java-common libatk-bridge2.0-0 libatk-wrapper-java libatk-wrapper-java-jni libatk1.0-0 libatk1.0-data libatspi2.0-0 libavahi-client3 libavahi-common-data libavahi-common3 libcups2 libdrm-amdgpu1 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libfontconfig1 libfontenc1 libgif7 libgl1 libgl1-mesa-dri libglapi-mesa libglvnd0 libglx-mesa0 libglx0 libgraphite2-3 libharfbuzz0b libice6 libjpeg-turbo8 libjpeg8 liblcms2-2 libllvm12 libpciaccess0 libpcsclite1 libsensors-config libsensors5 libsm6 libvulkan1 libwayland-client0 libx11-xcb1 libxaw7 libxcb-dri2-0 libxcb-dri3-0 libxcb-glx0 libxcb-present0 libxcb-randr0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-xfixes0 libxcomposite1 libxfixes3 libxft2 libxi6 libxinerama1 libxkbfile1 libxmu6 libxpm4 libxrandr2 libxrender1 libxshmfence1 libxt6 libxtst6 libxv1 libxxf86dga1 libxxf86vm1 mesa-vulkan-drivers openjdk-11-jre-headless x11-common x11-utils Suggested packages: default-jre cups-common liblcms2-utils pcscd lm-sensors libnss-mdns fonts-ipafont-gothic fonts-ipafont-mincho fonts-wqy-microhei | fonts-wqy-zenhei fonts-indic mesa-utils The following NEW packages will be installed: at-spi2-core ca-certificates-java fontconfig-config fonts-dejavu-core fonts-dejavu-extra java-common libatk-bridge2.0-0 libatk-wrapper-java libatk-wrapper-java-jni libatk1.0-0 libatk1.0-data libatspi2.0-0 libavahi-client3 libavahi-common-data libavahi-common3 libcups2 libdrm-amdgpu1 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libfontconfig1 libfontenc1 libgif7 libgl1 libgl1-mesa-dri libglapi-mesa libglvnd0 libglx-mesa0 libglx0 libgraphite2-3 libharfbuzz0b libice6 libjpeg-turbo8 libjpeg8 liblcms2-2 libllvm12 libpciaccess0 libpcsclite1 libsensors-config libsensors5 libsm6 libvulkan1 libwayland-client0 libx11-xcb1 libxaw7 libxcb-dri2-0 libxcb-dri3-0 libxcb-glx0 libxcb-present0 libxcb-randr0 libxcb-shape0 
libxcb-shm0 libxcb-sync1 libxcb-xfixes0 libxcomposite1 libxfixes3 libxft2 libxi6 libxinerama1 libxkbfile1 libxmu6 libxpm4 libxrandr2 libxrender1 libxshmfence1 libxt6 libxtst6 libxv1 libxxf86dga1 libxxf86vm1 mesa-vulkan-drivers openjdk-11-jre openjdk-11-jre-headless x11-common x11-utils 0 upgraded, 75 newly installed, 0 to remove and 150 not upgraded. Need to get 79.3 MB of archives. After this operation, 715 MB of additional disk space will be used. Do you want to continue? [Y/n] Y Get:1 http://in.archive.ubuntu.com/ubuntu focal/main amd64 libatspi2.0-0 amd64 2.36.0-2 [64.2 kB] Get:2 http://in.archive.ubuntu.com/ubuntu focal/main amd64 x11-common all 1:7.7+19ubuntu14 [22.3 kB] Get:3 http://in.archive.ubuntu.com/ubuntu focal/main amd64 libxtst6 amd64 2:1.2.3-1 [12.8 kB] Get:4 http://in.archive.ubuntu.com/ubuntu focal/main amd64 at-spi2-core amd64 2.36.0-2 [48.7 kB] Get:5 http://in.archive.ubuntu.com/ubuntu focal/main amd64 java-common all 0.72 [6,816 B] Get:6 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 libavahi-common-data amd64 0.7-4ubuntu7.1 [21.4 kB] Get:7 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 libavahi-common3 amd64 0.7-4ubuntu7.1 [21.7 kB] Get:8 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 libavahi-client3 amd64 0.7-4ubuntu7.1 [25.5 kB] Get:9 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 libcups2 amd64 2.3.1-9ubuntu1.1 [233 kB] Get:10 http://in.archive.ubuntu.com/ubuntu focal/main amd64 liblcms2-2 amd64 2.9-4 [140 kB] Get:11 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 libjpeg-turbo8 amd64 2.0.3-0ubuntu1.20.04.1 [117 kB] Get:12 http://in.archive.ubuntu.com/ubuntu focal/main amd64 libjpeg8 amd64 8c-2ubuntu8 [2,194 B] Get:13 http://in.archive.ubuntu.com/ubuntu focal/main amd64 fonts-dejavu-core all 2.37-1 [1,041 kB] Get:14 http://in.archive.ubuntu.com/ubuntu focal/main amd64 fontconfig-config all 2.13.1-2ubuntu3 [28.8 kB] Get:15 http://in.archive.ubuntu.com/ubuntu focal/main 
amd64 libfontconfig1 amd64 2.13.1-2ubuntu3 [114 kB] Get:16 http://in.archive.ubuntu.com/ubuntu focal/main amd64 libgraphite2-3 amd64 1.3.13-11build1 [73.5 kB] Get:17 http://in.archive.ubuntu.com/ubuntu focal/main amd64 libharfbuzz0b amd64 2.6.4-1ubuntu4 [391 kB] Get:18 http://in.archive.ubuntu * * Get:74 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 mesa-vulkan-drivers amd64 21.2.6-0ubuntu0.1~20.04.2 [5,788 kB] Get:75 http://in.archive.ubuntu.com/ubuntu focal-updates/main amd64 openjdk-11-jre amd64 11.0.15+10-0ubuntu0.20.04.1 [175 kB] Fetched 79.3 MB in 4s (19.1 MB/s) Extracting templates from packages: 100% Selecting previously unselected package libatspi2.0-0:amd64. (Reading database ... 71625 files and directories currently installed.) Preparing to unpack .../00-libatspi2.0-0_2.36.0-2_amd64.deb ... Unpacking libatspi2.0-0:amd64 (2.36.0-2) ... * * * Unpacking mesa-vulkan-drivers:amd64 (21.2.6-0ubuntu0.1~20.04.2) ... Selecting previously unselected package openjdk-11-jre:amd64. Preparing to unpack .../74-openjdk-11-jre_11.0.15+10-0ubuntu0.20.04.1_amd64.deb ... Unpacking openjdk-11-jre:amd64 (11.0.15+10-0ubuntu0.20.04.1) ... Setting up libgraphite2-3:amd64 (1.3.13-11build1) ... Setting up libxcb-dri3-0:amd64 (1.14-2) ... Setting up liblcms2-2:amd64 (2.9-4) ... Setting up libx11-xcb1:amd64 (2:1.6.9-2ubuntu1.2) ... Setting up libpciaccess0:amd64 (0.16-0ubuntu1) ... Setting up libdrm-nouveau2:amd64 (2.4.107-8ubuntu1~20.04.2) ... Setting up libxcb-xfixes0:amd64 (1.14-2) ... Setting up libxpm4:amd64 (1:3.5.12-1) ... Setting up libxi6:amd64 (2:1.7.10-0ubuntu1) ... Setting up java-common (0.72) ... * * * Setting up libxaw7:amd64 (2:1.0.13-1) ... Setting up x11-utils (7.7+5) ... Setting up libatk-wrapper-java (0.37.1-1) ... Setting up libatk-wrapper-java-jni:amd64 (0.37.1-1) ... Setting up openjdk-11-jre-headless:amd64 (11.0.15+10-0ubuntu0.20.04.1) ... 
update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/java to provide /usr/bin/java (java) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/jjs to provide /usr/bin/jjs (jjs) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/keytool to provide /usr/bin/keytool (keytool) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/rmid to provide /usr/bin/rmid (rmid) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/rmiregistry to provide /usr/bin/rmiregistry (rmiregistry) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/pack200 to provide /usr/bin/pack200 (pack200) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/bin/unpack200 to provide /usr/bin/unpack200 (unpack200) in auto mode update-alternatives: using /usr/lib/jvm/java-11-openjdk-amd64/lib/jexec to provide /usr/bin/jexec (jexec) in auto mode Setting up openjdk-11-jre:amd64 (11.0.15+10-0ubuntu0.20.04.1) ... Setting up ca-certificates-java (20190405ubuntu1) ... 
head: cannot open '/etc/ssl/certs/java/cacerts' for reading: No such file or directory Adding debian:QuoVadis_Root_CA_3_G3.pem Adding debian:Buypass_Class_2_Root_CA.pem Adding debian:Amazon_Root_CA_2.pem Adding debian:Sonera_Class_2_Root_CA.pem Adding debian:Atos_TrustedRoot_2011.pem Adding debian:Certigna_Root_CA.pem * * * Adding debian:Staat_der_Nederlanden_Root_CA_-_G3.pem Adding debian:ISRG_Root_X1.pem Adding debian:OISTE_WISeKey_Global_Root_GB_CA.pem Adding debian:e-Szigno_Root_CA_2017.pem Adding debian:UCA_Global_G2_Root.pem Adding debian:IdenTrust_Commercial_Root_CA_1.pem Adding debian:GlobalSign_ECC_Root_CA_-_R5.pem Adding debian:XRamp_Global_CA_Root.pem Adding debian:IdenTrust_Public_Sector_Root_CA_1.pem Adding debian:Hellenic_Academic_and_Research_Institutions_RootCA_2011.pem Adding debian:GlobalSign_Root_CA_-_R3.pem Adding debian:emSign_Root_CA_-_C1.pem Adding debian:certSIGN_Root_CA_G2.pem Adding debian:Amazon_Root_CA_4.pem done. Processing triggers for mime-support (3.64ubuntu1) ... Processing triggers for libc-bin (2.31-0ubuntu9.2) ... Processing triggers for systemd (245.4-4ubuntu3.15) ... Processing triggers for man-db (2.9.1-1) ... Processing triggers for ca-certificates (20210119~20.04.2) ... Updating certificates in /etc/ssl/certs... 0 added, 0 removed; done. Running hooks in /etc/ca-certificates/update.d... done. done. root@jenkins:~# Set Repository curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo tee \ /usr/share/keyrings/jenkins-keyring.asc > /dev/null echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] \ https://pkg.jenkins.io/debian-stable binary/ | sudo tee \ /etc/apt/sources.list.d/jenkins.list > /dev/null Update Repository again sudo apt-get update Install Jenkins root@jenkins:~# sudo apt-get install jenkins Reading package lists... Done Building dependency tree Reading state information... 
Done The following package was automatically installed and is no longer required: libfwupdplugin1 Use 'sudo apt autoremove' to remove it. The following additional packages will be installed: net-tools The following NEW packages will be installed: jenkins net-tools 0 upgraded, 2 newly installed, 0 to remove and 0 not upgraded. Need to get 87.9 MB of archives. After this operation, 92.1 MB of additional disk space will be used. Do you want to continue? [Y/n] Y Get:1 http://in.archive.ubuntu.com/ubuntu focal/main amd64 net-tools amd64 1.60+git20180626.aebd88e-1ubuntu1 [196 kB] Get:2 https://pkg.jenkins.io/debian-stable binary/ jenkins 2.346.1 [87.7 MB] Fetched 87.9 MB in 44s (1,978 kB/s) Selecting previously unselected package net-tools. (Reading database ... 109455 files and directories currently installed.) Preparing to unpack .../net-tools_1.60+git20180626.aebd88e-1ubuntu1_amd64.deb ... Unpacking net-tools (1.60+git20180626.aebd88e-1ubuntu1) ... Selecting previously unselected package jenkins. Preparing to unpack .../jenkins_2.346.1_all.deb ... Unpacking jenkins (2.346.1) ... Setting up net-tools (1.60+git20180626.aebd88e-1ubuntu1) ... Setting up jenkins (2.346.1) ... Created symlink /etc/systemd/system/multi-user.target.wants/jenkins.service → /lib/systemd/system/jenkins.service. Processing triggers for man-db (2.9.1-1) ... Processing triggers for systemd (245.4-4ubuntu3.17) ... 
    root@jenkins:~# Jenkins is now successfully installed. To access Jenkins, enter this URL in the browser http://jenkins:8080 Get the password as suggested cat /var/lib/jenkins/secrets/initialAdminPassword Copy the password and enter it in the field on the browser. Once we click Continue after entering the password, you are presented with a pane where you can install plugins. Click "Install suggested plugins" to start the installation. Once all of the plugins are installed, it then prompts you to create a user. Once we enter the first admin user's information and click Save and Continue, it presents the URL needed to log in to Jenkins. Now click Save and Finish. Jenkins is now ready. Now, when we click "Start using Jenkins", you are taken to the Jenkins dashboard
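The Jenkins web UI can take a little while to answer after the service starts, so a small poller that waits for port 8080 saves a few browser refreshes. This helper is my own sketch, not part of Jenkins; the hostname jenkins is just my lab VM's name.

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 120.0) -> bool:
    """Poll until host:port accepts TCP connections, or give up.

    Returns True as soon as a connection succeeds, False once the
    timeout expires.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            # Service not up yet; back off briefly and retry.
            time.sleep(1)
    return False
```

Right after starting the service, wait_for_port("jenkins", 8080) returning True means the UI is ready to open in the browser.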

  • Scaling Down vRealize Automation 7.x

    Use Case With vRealize Automation 7.x reaching its end of life in September 2022, most customers have either adopted version 8.x or are in transition and will get there eventually. It's not easy to stop an enterprise application and decommission it overnight, but it is possible to scale it down rather than keep it distributed and highly available. Keeping this in mind, I thought I'd pen down a few steps on how to scale down vRA 7.x Environment I built a 3-node vRA appliance cluster and 2 IaaS servers, named as below Procedure Take snapshots across all nodes before performing any of the below steps, and back up the databases too. The output of listing all nodes in the cluster looks like below in my lab Node: NodeHost: svraone.cap.org NodeId: cafe.node.631087009.16410 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: True Component: Type: vRO Version: 7.6.0.12923317 Node: NodeHost: siaastwo.cap.org NodeId: 7DD5F70C-976F-4635-89F8-582986851E98 NodeType: IAAS Components: Component: Type: Website Version: 7.6.0.16195 State: Started Component: Type: ModelManagerWeb Version: 7.6.0.16195 State: Started Component: Type: ManagerService Version: 7.6.0.16195 State: Active Component: Type: ManagementAgent Version: 7.6.0.17541 State: Started Component: Type: DemWorker Version: 7.6.0.16195 State: Started Component: Type: DemOrchestrator Version: 7.6.0.16195 State: Started Component: Type: WAPI Version: 7.6.0.16195 State: Started Component: Type: vSphereAgent Version: 7.6.0.16195 State: Started Node: NodeHost: siaasone.cap.org NodeId: B030EDF7-DB2C-4830-942A-F40D9464AAD9 NodeType: IAAS Components: Component: Type: Database Version: 7.6.0.16195 State: Available Component: Type: Website Version: 7.6.0.16195 State: Started Component: Type: ModelManagerData Version: 7.6.0.16195 State: Available Component: Type: ModelManagerWeb Version: 7.6.0.16195 State: Started Component: Type: ManagerService Version: 7.6.0.16195 State: Passive Component: Type: ManagementAgent 
Version: 7.6.0.17541 State: Started Component: Type: DemOrchestrator Version: 7.6.0.16195 State: Started Component: Type: DemWorker Version: 7.6.0.16195 State: Started Component: Type: WAPI Version: 7.6.0.16195 State: Started Component: Type: vSphereAgent Version: 7.6.0.16195 State: Started Node: NodeHost: svrathree.cap.org NodeId: cafe.node.384204123.10666 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: False Component: Type: vRO Version: 7.6.0.12923317 Node: NodeHost: svratwo.cap.org NodeId: cafe.node.776067309.27389 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: False Component: Type: vRO Version: 7.6.0.12923317 To scale down, I'd like to remove my secondary nodes and leave just the primary in my cluster. I'll begin my scale-down with the IaaS nodes, that is, siaastwo.cap.org. I'll open the VAMI of my master node and then click on the Cluster tab. Because we powered off the second IaaS node, it won't show in a connected state. The moment I click "Delete" next to the IaaS secondary node, I get a warning shown as below svraone:5480 says Do you really want to delete the node 7DD5F70C-976F-4635-89F8-582986851E98 which was last connected 11 minutes ago? You will need to remove its hostname from an external load balancer! 
This ID: 7DD5F70C-976F-4635-89F8-582986851E98 belongs to siaastwo.cap.org, see the output below Node: NodeHost: siaastwo.cap.org NodeId: 7DD5F70C-976F-4635-89F8-582986851E98 NodeType: IAAS Components: Component: Type: Website Version: 7.6.0.16195 State: Started Component: Type: ModelManagerWeb Version: 7.6.0.16195 State: Started Component: Type: ManagerService Version: 7.6.0.16195 State: Active Component: Type: ManagementAgent Version: 7.6.0.17541 State: Started Component: Type: DemWorker Version: 7.6.0.16195 State: Started Component: Type: DemOrchestrator Version: 7.6.0.16195 State: Started Component: Type: WAPI Version: 7.6.0.16195 State: Started Component: Type: vSphereAgent Version: 7.6.0.16195 State: Started Now confirm the deletion. The node is now successfully removed. To monitor, one can take a look at /var/log/messages 2022-07-05T23:12:47.929846+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Logging event node-removed 2022-07-05T23:12:47.929877+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/05-db-sync 2022-07-05T23:12:47.930565+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9692]: info Resolved vCAC host: svraone.cap.org 2022-07-05T23:12:48.005902+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/05-db-sync: IS_MASTER: 'True', NODES: 'svraone.cap.org svrathree.cap.org svratwo.cap.org' 2022-07-05T23:12:48.039511+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 05-db-sync is 2022-07-05T23:12:48.039556+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/10-rabbitmq 2022-07-05T23:12:48.125189+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/10-rabbitmq: REMOVED_NODE: 'siaastwo.cap.org', hostname: 'svraone.cap.org' 2022-07-05T23:12:48.130809+00:00 svraone vami 
/opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 10-rabbitmq is 2022-07-05T23:12:48.130832+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/20-haproxy 2022-07-05T23:12:48.233369+00:00 svraone node-removed: Removing 'siaastwo.cap.org' from haproxy config 2022-07-05T23:12:48.265237+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info Jul 05, 2022 11:12:48 PM org.springframework.jdbc.datasource.SingleConnectionDataSource initConnection 2022-07-05T23:12:48.265459+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info INFO: Established shared JDBC Connection: org.postgresql.jdbc.PgConnection@6ab7a896 2022-07-05T23:12:48.308782+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 20-haproxy is Loaded HAProxy configuration file: /etc/haproxy/conf.d/30-vro-config.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/20-vcac.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/40-xenon.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/10-psql.cfg Reload service haproxy ..done 2022-07-05T23:12:48.308807+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/25-db 2022-07-05T23:12:48.353287+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info [2022-07-05 23:12:48] [root] [INFO] Current node in cluster mode 2022-07-05T23:12:48.353314+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info command exit code: 1 2022-07-05T23:12:48.353322+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info cluster-mode-check [2022-07-05 23:12:48] [root] [INFO] Current node in cluster mode 2022-07-05T23:12:48.354087+00:00 svraone vami 
/opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info Executing shell command... 2022-07-05T23:12:48.458204+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/25-db: REMOVED_NODE: 'siaastwo.cap.org', hostname: 'svraone.cap.org' 2022-07-05T23:12:48.461776+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 25-db is 2022-07-05T23:12:48.461800+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/30-vidm-db 2022-07-05T23:12:48.827039+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/30-vidm-db: IS_MASTER: 'True', REMOVED_NODE: 'siaastwo.cap.org' 2022-07-05T23:12:48.847537+00:00 svraone node-removed: Removing 'siaastwo' from horizon database tables 2022-07-05T23:12:48.852777+00:00 svraone su: (to postgres) root on none 2022-07-05T23:12:50.279007+00:00 svraone su: last message repeated 3 times 2022-07-05T23:12:50.278863+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 30-vidm-db is DELETE 0 DELETE 0 Last login: Tue Jul 5 23:12:48 UTC 2022 DELETE 0 Last login: Tue Jul 5 23:12:49 UTC 2022 DELETE 0 Last login: Tue Jul 5 23:12:49 UTC 2022 2022-07-05T23:12:50.278889+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master 2022-07-05T23:12:50.370747+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master: IS_MASTER: 'True', REMOVED_NODE: 'siaastwo.cap.org' 2022-07-05T23:12:50.383950+00:00 svraone node-removed: Removing 'rabbit@siaastwo' from rabbitmq cluster 2022-07-05T23:12:50.424780+00:00 svraone su: (to rabbitmq) root on none 2022-07-05T23:12:50.476667+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[9692]: info Event request for siaastwo.cap.org timed out 2022-07-05T23:12:51.335973+00:00 svraone vami 
/opt/vmware/share/htdocs/service/cluster/cluster-config.py[9693]: info Executing shell command... 2022-07-05T23:12:52.455441+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 40-rabbitmq-master is Removing node rabbit@siaastwo from cluster 2022-07-05T23:12:52.455467+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/50-elasticsearch 2022-07-05T23:12:52.550899+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/50-elasticsearch: IS_MASTER: 'True' 2022-07-05T23:12:52.564092+00:00 svraone node-removed: Restarting elasticsearch service 2022-07-05T23:12:52.733672+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Result from 50-elasticsearch is Stopping elasticsearch: process in pidfile `/opt/vmware/elasticsearch/elasticsearch.pid'done. Starting elasticsearch: 2048 2022-07-05T23:12:52.733690+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[9919]: info Executing /etc/vr/cluster-event/node-removed.d/60-vidm-health 2022-07-05T23:12:52.883943+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/60-vidm-health: IS_MASTER: 'True', REMOVED_NODE: 'siaastwo.cap.org' Executing the list-nodes command, you can now see there are no references to siaastwo.cap.org Node: NodeHost: svraone.cap.org NodeId: cafe.node.631087009.16410 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: True Component: Type: vRO Version: 7.6.0.12923317 Node: NodeHost: siaasone.cap.org NodeId: B030EDF7-DB2C-4830-942A-F40D9464AAD9 NodeType: IAAS Components: Component: Type: Database Version: 7.6.0.16195 State: Available Component: Type: Website Version: 7.6.0.16195 State: Started Component: Type: ModelManagerData Version: 7.6.0.16195 State: Available Component: Type: ModelManagerWeb Version: 7.6.0.16195 State: Started Component: Type: ManagerService Version: 7.6.0.16195 State: Active Component: 
Type: ManagementAgent Version: 7.6.0.17541 State: Started Component: Type: DemOrchestrator Version: 7.6.0.16195 State: Started Component: Type: DemWorker Version: 7.6.0.16195 State: Started Component: Type: WAPI Version: 7.6.0.16195 State: Started Component: Type: vSphereAgent Version: 7.6.0.16195 State: Started Node: NodeHost: svrathree.cap.org NodeId: cafe.node.384204123.10666 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: False Component: Type: vRO Version: 7.6.0.12923317 NodeHost: svratwo.cap.org NodeId: cafe.node.776067309.27389 NodeType: VA Components: Component: Type: vRA Version: 7.6.0.317 Primary: False Component: Type: vRO Version: 7.6.0.12923317 Now, let's move on to removing the second and third appliances from the cluster. Before removing the nodes, I'll remove the connectors coming from those nodes. Once the connectors are removed, we'll proceed with removing the vRA appliances from the cluster. Take one more round of snapshots. Once the snapshot tasks are complete, we'll proceed with appliance removal. Remember, you cannot and should not remove the master from the cluster. 
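Before moving on: the node-removed event scripts earlier stripped siaastwo.cap.org from the HAProxy configs under /etc/haproxy/conf.d/. A quick hedged way to confirm no stale references remain after a removal — the helper below is my own sketch, not a product tool, and the directory and hostname are parameters you supply:

```shell
# Hypothetical helper: succeeds (exit 0) if the removed node's hostname
# still appears anywhere under the given haproxy config directory.
node_in_haproxy() {
  local conf_dir="$1" node="$2"
  grep -rq "$node" "$conf_dir"
}

# Example: after removal, this should fail (non-zero exit), meaning clean.
# node_in_haproxy /etc/haproxy/conf.d siaastwo.cap.org || echo "clean"
```

The same check works for svrathree.cap.org and svratwo.cap.org later in the removal sequence.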
Ensure the database is in Asynchronous mode. Click Delete next to svrathree.cap.org to remove the node from the cluster. While the node is being removed, you can check /var/log/messages or /var/log/vmware/vcac/vcac-config.log for more information: 2022-07-06T00:11:53.985239+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Processing request PUT /vacluster/event/node-removed 2022-07-06T00:11:54.123523+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39336]: info Resolved vCAC host: svraone.cap.org 2022-07-06T00:11:54.255543+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Logging event node-removed 2022-07-06T00:11:54.255982+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/05-db-sync 2022-07-06T00:11:54.331159+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/05-db-sync: IS_MASTER: 'True', NODES: 'svraone.cap.org svratwo.cap.org' 2022-07-06T00:11:54.339000+00:00 svraone node-removed: Setting database to ASYNC 2022-07-06T00:11:54.898295+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info Jul 06, 2022 12:11:54 AM org.springframework.jdbc.datasource.SingleConnectionDataSource initConnection 2022-07-06T00:11:54.898572+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info INFO: Established shared JDBC Connection: org.postgresql.jdbc.PgConnection@6ab7a896 2022-07-06T00:11:54.961623+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info [2022-07-06 00:11:54] [root] [INFO] Current node in cluster mode 2022-07-06T00:11:54.961643+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info command exit code: 1 2022-07-06T00:11:54.961651+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info 
cluster-mode-check [2022-07-06 00:11:54] [root] [INFO] Current node in cluster mode 2022-07-06T00:11:54.961660+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info Executing shell command... 2022-07-06T00:11:56.423082+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 05-db-sync is 2022-07-06T00:11:56.423104+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/10-rabbitmq 2022-07-06T00:11:56.514913+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/10-rabbitmq: REMOVED_NODE: 'svrathree.cap.org', hostname: 'svraone.cap.org' 2022-07-06T00:11:56.518400+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 10-rabbitmq is 2022-07-06T00:11:56.518424+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/20-haproxy 2022-07-06T00:11:56.619648+00:00 svraone node-removed: Removing 'svrathree.cap.org' from haproxy config 2022-07-06T00:11:56.749906+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 20-haproxy is Loaded HAProxy configuration file: /etc/haproxy/conf.d/30-vro-config.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/20-vcac.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/40-xenon.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/10-psql.cfg Reload service haproxy ..done 2022-07-06T00:11:56.749930+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/25-db 2022-07-06T00:11:56.894988+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/25-db: REMOVED_NODE: 'svrathree.cap.org', hostname: 'svraone.cap.org' 2022-07-06T00:11:56.898755+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info 
Result from 25-db is 2022-07-06T00:11:56.898777+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/30-vidm-db 2022-07-06T00:11:56.982014+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info Executing shell command... 2022-07-06T00:11:56.986794+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/30-vidm-db: IS_MASTER: 'True', REMOVED_NODE: 'svrathree.cap.org' 2022-07-06T00:11:56.995099+00:00 svraone node-removed: Removing 'svrathree' from horizon database tables 2022-07-06T00:11:57.000134+00:00 svraone su: (to postgres) root on none 2022-07-06T00:11:57.519218+00:00 svraone su: last message repeated 3 times 2022-07-06T00:11:57.519100+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 30-vidm-db is DELETE 1 DELETE 1 Last login: Wed Jul 6 00:11:56 UTC 2022 DELETE 0 Last login: Wed Jul 6 00:11:57 UTC 2022 DELETE 1 Last login: Wed Jul 6 00:11:57 UTC 2022 2022-07-06T00:11:57.519126+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master 2022-07-06T00:11:57.602251+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master: IS_MASTER: 'True', REMOVED_NODE: 'svrathree.cap.org' 2022-07-06T00:11:57.611328+00:00 svraone node-removed: Removing 'rabbit@svrathree' from rabbitmq cluster 2022-07-06T00:11:57.649937+00:00 svraone su: (to rabbitmq) root on none 2022-07-06T00:11:59.704582+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 40-rabbitmq-master is Removing node rabbit@svrathree from cluster 2022-07-06T00:11:59.704608+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/50-elasticsearch 2022-07-06T00:11:59.785378+00:00 svraone node-removed: 
/etc/vr/cluster-event/node-removed.d/50-elasticsearch: IS_MASTER: 'True' 2022-07-06T00:11:59.790096+00:00 svraone node-removed: Restarting elasticsearch service 2022-07-06T00:11:59.964638+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[39337]: info Executing shell command... 2022-07-06T00:11:59.987551+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Result from 50-elasticsearch is Stopping elasticsearch: process in pidfile `/opt/vmware/elasticsearch/elasticsearch.pid'done. Starting elasticsearch: 2048 2022-07-06T00:11:59.987575+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[39546]: info Executing /etc/vr/cluster-event/node-removed.d/60-vidm-health 2022-07-06T00:12:00.110181+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/60-vidm-health: IS_MASTER: 'True', REMOVED_NODE: 'svrathree.cap.org' Rabbitmq cluster status returns only 2 nodes now [master] svraone:~ # rabbitmqctl cluster_status Cluster status of node rabbit@svraone [{nodes,[{disc,[rabbit@svraone,rabbit@svratwo]}]}, {running_nodes,[rabbit@svratwo,rabbit@svraone]}, {cluster_name,<<"rabbit@svraone.cap.org">>}, {partitions,[]}, {alarms,[{rabbit@svratwo,[]},{rabbit@svraone,[]}]}] Do the same with the second node that's svratwo.cap.org 2022-07-06T00:28:23.511761+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Processing request PUT /vacluster/event/node-removed 2022-07-06T00:28:23.584626+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6987]: info Resolved vCAC host: svraone.cap.org 2022-07-06T00:28:23.787863+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Logging event node-removed 2022-07-06T00:28:23.787888+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/05-db-sync 2022-07-06T00:28:23.861100+00:00 svraone node-removed: 
/etc/vr/cluster-event/node-removed.d/05-db-sync: IS_MASTER: 'True', NODES: 'svraone.cap.org' 2022-07-06T00:28:23.869175+00:00 svraone node-removed: Setting database to ASYNC 2022-07-06T00:28:23.875065+00:00 svraone vami /opt/vmware/share/htdocs/service/cafe/config.py[7226]: info Processing request PUT /config/nodes/B030EDF7-DB2C-4830-942A-F40D9464AAD9/ping, referer: None 2022-07-06T00:28:23.955013+00:00 svraone vami /opt/vmware/share/htdocs/service/cafe/config.py[7226]: info Legacy authentication token received from ::ffff:AA.BBB.CC.DDD 2022-07-06T00:28:24.365470+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info Jul 06, 2022 12:28:24 AM org.springframework.jdbc.datasource.SingleConnectionDataSource initConnection 2022-07-06T00:28:24.365492+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info INFO: Established shared JDBC Connection: org.postgresql.jdbc.PgConnection@6ab7a896 2022-07-06T00:28:24.432795+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info [2022-07-06 00:28:24] [root] [INFO] Current node not in cluster mode 2022-07-06T00:28:24.432956+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info command exit code: 0 2022-07-06T00:28:24.433160+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info cluster-mode-check Current node not in cluster mode 2022-07-06T00:28:24.433520+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info Executing shell command... 
2022-07-06T00:28:25.847315+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 05-db-sync is 2022-07-06T00:28:25.847337+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/10-rabbitmq 2022-07-06T00:28:25.925379+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/10-rabbitmq: REMOVED_NODE: 'svratwo.cap.org', hostname: 'svraone.cap.org' 2022-07-06T00:28:25.939461+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 10-rabbitmq is 2022-07-06T00:28:25.939482+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/20-haproxy 2022-07-06T00:28:26.015719+00:00 svraone node-removed: Removing 'svratwo.cap.org' from haproxy config 2022-07-06T00:28:26.116374+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 20-haproxy is Loaded HAProxy configuration file: /etc/haproxy/conf.d/30-vro-config.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/20-vcac.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/40-xenon.cfg Loaded HAProxy configuration file: /etc/haproxy/conf.d/10-psql.cfg Reload service haproxy ..done 2022-07-06T00:28:26.116394+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/25-db 2022-07-06T00:28:26.229282+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/25-db: REMOVED_NODE: 'svratwo.cap.org', hostname: 'svraone.cap.org' 2022-07-06T00:28:26.232359+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 25-db is 2022-07-06T00:28:26.232378+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/30-vidm-db 2022-07-06T00:28:26.303362+00:00 svraone node-removed: 
/etc/vr/cluster-event/node-removed.d/30-vidm-db: IS_MASTER: 'True', REMOVED_NODE: 'svratwo.cap.org' 2022-07-06T00:28:26.313158+00:00 svraone node-removed: Removing 'svratwo' from horizon database tables 2022-07-06T00:28:26.318183+00:00 svraone su: (to postgres) root on none 2022-07-06T00:28:26.809478+00:00 svraone su: last message repeated 3 times 2022-07-06T00:28:26.809378+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 30-vidm-db is DELETE 1 Last login: Wed Jul 6 00:11:57 UTC 2022 DELETE 1 Last login: Wed Jul 6 00:28:26 UTC 2022 DELETE 0 Last login: Wed Jul 6 00:28:26 UTC 2022 DELETE 1 Last login: Wed Jul 6 00:28:26 UTC 2022 2022-07-06T00:28:26.809402+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master 2022-07-06T00:28:26.904576+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/40-rabbitmq-master: IS_MASTER: 'True', REMOVED_NODE: 'svratwo.cap.org' 2022-07-06T00:28:26.917409+00:00 svraone node-removed: Removing 'rabbit@svratwo' from rabbitmq cluster 2022-07-06T00:28:26.971570+00:00 svraone su: (to rabbitmq) root on none 2022-07-06T00:28:28.544064+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 40-rabbitmq-master is Removing node rabbit@svratwo from cluster 2022-07-06T00:28:28.544087+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/50-elasticsearch 2022-07-06T00:28:28.578987+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster-config.py[6988]: info Executing shell command... 
2022-07-06T00:28:28.620636+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/50-elasticsearch: IS_MASTER: 'True' 2022-07-06T00:28:28.625562+00:00 svraone node-removed: Restarting elasticsearch service 2022-07-06T00:28:28.811215+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Result from 50-elasticsearch is Stopping elasticsearch: process in pidfile `/opt/vmware/elasticsearch/elasticsearch.pid'done. Starting elasticsearch: 2048 2022-07-06T00:28:28.811240+00:00 svraone vami /opt/vmware/share/htdocs/service/cluster/cluster.py[7185]: info Executing /etc/vr/cluster-event/node-removed.d/60-vidm-health 2022-07-06T00:28:29.042125+00:00 svraone node-removed: /etc/vr/cluster-event/node-removed.d/60-vidm-health: IS_MASTER: 'True', REMOVED_NODE: 'svratwo.cap.org' Here's the final status. Even though svratwo was out of the cluster, it was still showing up in the RabbitMQ cluster status. Perform a "Reset Rabbitmq" to get rid of this stale node. Once completed, it should all be good: [master] svraone:~ # rabbitmqctl cluster_status Cluster status of node rabbit@svraone [{nodes,[{disc,[rabbit@svraone]}]}, {running_nodes,[rabbit@svraone]}, {cluster_name,<<"rabbit@svraone.cap.org">>}, {partitions,[]}, {alarms,[{rabbit@svraone,[]}]}] Log into the vRA portal and see if everything is functional. Perform a health check on the IaaS nodes and endpoints as well as a generic health check. Also, check whether deployments are working if you're still using this 7.x version. This concludes the blog. Before making any changes, take snapshots. Once all verifications are completed, go ahead and power off the nodes we removed, and delete them after a week or so based on your environment
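The cluster_status outputs above differ only in the running_nodes list. A small hedged sketch — the function is my own, not a product tool — that counts running nodes from `rabbitmqctl cluster_status` output piped on stdin, handy when scripting a post-removal check; it assumes the running_nodes tuple sits on a single line, as in the outputs shown:

```shell
# Counts rabbit@<host> entries inside the running_nodes tuple of
# `rabbitmqctl cluster_status` output read from stdin.
count_running_rabbit_nodes() {
  grep -o 'running_nodes,\[[^]]*\]' | grep -o 'rabbit@[A-Za-z0-9._-]*' | wc -l
}

# Usage on the master node:
# rabbitmqctl cluster_status | count_running_rabbit_nodes
```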

  • Multi Node SaltStack Installation ( flow chart )

There are two methods to install SaltStack Config: the standard method and the vRealize Suite Lifecycle Manager method. The second is pretty straightforward. If you want a multi-node SaltStack, you need to follow the standard method by installing the Salt components on multiple nodes. The flow chart above explains the detailed procedure for installing a multi-node SaltStack Config. Note: There are certain design decisions you have to make according to your requirements, but the overall procedure remains the same. If you're unable to view the image properly, get the high-resolution JPG by downloading and extracting this zip file.

  • Validation of SaltStack endpoint fails when Running Environment "embedded-ABX-onprem" is chosen

Problem Statement On a greenfield installation of vRA 8.7 integrated with SaltStack Config, we have an option to choose a "Running Environment". The screenshot below shows how a default SaltStack Config deployed through vRSLCM is presented under the Infrastructure tab. When you enter the password for root without selecting any option under "Running Environment", it validates successfully. But the moment you select a "Running Environment", the validation fails with an exception. Before the exception surfaces, an ABX integration run occurs, which gives you more details: Running in polyglot! [2022-04-06 13:56:22,183] [INFO] - [saltstack-integration] Validating Salt Stack Config Server credentials... [2022-04-06 13:56:22,183] [INFO] - [saltstack-integration] Authenticating to a Salt Stack Config Server with url [https://ss.cap.org//account/login]... [2022-04-06 13:56:22,184] [INFO] - [saltstack-integration] Retrieving credentials from auth credentials link at [/core/auth/credentials/f0c26468-4c1b-4a62-b33a-b04d7c03390e]... [2022-04-06 13:56:22,304] [INFO] - [saltstack-integration] Successfully retrieved credentials from auth credentials link [2022-04-06 13:56:22,304] [INFO] - [saltstack-integration] Retrieving Salt Stack Config Server XSRF token from url [https://ss.cap.org//account/login]... /run/abx-polyglot/function/urllib3/connectionpool.py:1050: InsecureRequestWarning: Unverified HTTPS request is being made to host 'ss.cap.org'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings InsecureRequestWarning, [2022-04-06 13:56:22,327] [ERROR] - [saltstack-integration] Failed to validate Salt Stack Config Server credentials: Failed to authenticate to a Salt Stack Config Server: Failed to retrieve Salt Stack Config Server XSRF token: 403 Client Error: Forbidden for url: https://ss.cap.org//account/login Finished running action code. 
Exiting python process. Python process exited. Max Memory Used: 22 MB The reason for the exception is: Failed to validate Salt Stack Config Server credentials: Failed to authenticate to a Salt Stack Config Server: Failed to retrieve Salt Stack Config Server XSRF token: 403 Client Error: Forbidden for url: https://ss.cap.org//account/login The corresponding exception in provisioning-service-app.log: 2022-04-06T16:37:34.118Z WARN provisioning [host='provisioning-service-app-6885766867-kgk4l' thread='reactor-http-epoll-10' user='provisioning-RVgAJFw9LrOYkeUr(arun)' org='c2eae67a-ff6d-4dae-9fd3-6594352a1f8a' trace='dc45aa9b-4b4e-47d3-8176-8321b1a2336a' parent='4a10178e-bf5f-48d0-8928-ae7a84e3aff4' span='d1977a94-1448-45a5-b93a-e449c8a76b60'] c.v.xenon.common.ServiceErrorResponse.create:85 - message: Failed to authenticate, please check your credentials or if the host is reachable, statusCode: 400, serverErrorId: 9c245260-075a-4dc0-bbe2-fb13b0e5d0bd: Caused by java.lang.RuntimeException: Failed to authenticate, please check your credentials or if the host is reachable at com.vmware.xenon.common.SpringHostUtils.responseEntityToOperation(SpringHostUtils.java:952) at com.vmware.xenon.common.SpringHostUtils.lambda$sendRequest$4(SpringHostUtils.java:289) at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) at reactor.core.publisher.MonoToCompletableFuture.onNext(MonoToCompletableFuture.java:64) at reactor.core.publisher.FluxOnAssembly$OnAssemblySubscriber.onNext(FluxOnAssembly.java:539) at io.opentracing.contrib.reactor.TracedSubscriber.lambda$onNext$2(TracedSubscriber.java:69) at 
io.opentracing.contrib.reactor.TracedSubscriber.withActiveSpan(TracedSubscriber.java:95) at io.opentracing.contrib.reactor.TracedSubscriber.onNext(TracedSubscriber.java:69) at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127) at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) at io.opentracing.contrib.reactor.TracedSubscriber.lambda$onNext$2(TracedSubscriber.java:69) at io.opentracing.contrib.reactor.TracedSubscriber.withActiveSpan(TracedSubscriber.java:95) at io.opentracing.contrib.reactor.TracedSubscriber.onNext(TracedSubscriber.java:69) at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127) * * * * * at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795) at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480) at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378) at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) Remediation Method 1 (Greenfield Scenario) If you have a SaltStack that was recently deployed and doesn't have any resources mapped to this integration, simply delete the integration and recreate it. Before After See the difference; pay attention to the hostname you're adding in the integration. When vRSLCM adds the SaltStack integration, it uses a URL ( https://<>/ ), which is when you would see the problem. Once you remove the integration and add it back again with just the FQDN rather than the URL of the SaltStack server, then select a Running Environment, it all works fine. 
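The double slash in the failing URL ( https://ss.cap.org//account/login ) is what you get when /account/login is appended to a hostName that is already a full URL with a trailing slash. Here is a rough shell reconstruction of that assembly — the build_login_url helper is illustrative, not the integration's actual code, but it reproduces the behavior seen in the logs:

```shell
# Illustrative reconstruction: a hostName that already carries a scheme is
# appended to as-is, while a bare FQDN gets https:// prepended.
build_login_url() {
  local host="$1"
  case "$host" in
    http://*|https://*) printf '%s/account/login\n' "$host" ;;
    *)                  printf 'https://%s/account/login\n' "$host" ;;
  esac
}

build_login_url "https://ss.cap.org/"   # -> https://ss.cap.org//account/login (403)
build_login_url "ss.cap.org"            # -> https://ss.cap.org/account/login  (works)
```

This is why storing the bare FQDN in the integration fixes the 403 on the XSRF token request.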
Method 2 (Brownfield Scenario) When you have resources being managed by SaltStack, the integration information is stored inside the provisioning-db of the vRealize Automation environment. To log into the database, use vracli dev psql Accept the warning that it's a developer command and ensure you know what you are changing. Below is the screenshot and output of the table where the integration information is stored. The table is endpoint_state, present inside provisioning-db. To connect to provisioning-db, use the command \c provisioning-db root@vra [ ~ ]# vracli dev psql This execution will be recorded! 'psql' is a developer command. Type 'yes' if you want to continue, or 'no' to stop: yes 2022-04-06 14:14:43,439 [INFO] Logging into database template1 psql (10.18) Type "help" for help. template1=# \c provisioning-db You are now connected to database "provisioning-db" as user "postgres". provisioning-db=# \x Expanded display is on. provisioning-db=# select * from endpoint_state where name = 'vssc_idm'; -[ RECORD 1 ]-------------------+--------------------------------------------------------------------------------------------------------------------- document_self_link | /resources/endpoints/b2b02510-b0d5-46cf-9248-570b3d1bd58d document_auth_principal_link | /provisioning/auth/csp/users/cgs-lecvl28lpzqwhozt@provisioning-client.local document_expiration_time_micros | 0 document_owner | document_update_action | PATCH document_update_time_micros | 1646141393852000 document_version | 1 id | 6c2679af-a23c-4c88-8af0-3380305e3cde name | vssc_idm c_desc | custom_properties | {"hostName": "https://ss.cap.org/", "isExternal": "true", "privateKeyId": "root"} tenant_links | ["/tenants/organization/c2eae67a-ff6d-4dae-9fd3-6594352a1f8a", "/tenants/project/1f32c781c7bac475-7f703c5265a63d87"] group_links | tag_links | org_auth_link | /tenants/organization/c2eae67a-ff6d-4dae-9fd3-6594352a1f8a project_auth_link | owner_auth_link | msp_auth_link | creation_time_micros | 
region_id | endpoint_links | compute_host_link | /resources/compute/c6f3a8ac-c700-41b2-a91d-91a3fdd73765 expanded_tags | document_creation_time_micros | 1646141393802000 endpoint_type | saltstack auth_credentials_link | /core/auth/credentials/eeb389af-fcd6-4b06-a0e9-5d178f128eed compute_link | /resources/compute/c6f3a8ac-c700-41b2-a91d-91a3fdd73765 compute_description_link | /resources/compute-descriptions/8a689d42-24f7-4f6d-b362-e85b6dc6f423 resource_pool_link | /resources/pools/1f32c781c7bac475-7f703c5265a63d87 parent_link | associated_endpoint_links | endpoint_properties | {"hostName": "https://ss.cap.org/", "privateKeyId": "root"} maintenance_mode | mobility_endpoint_links | provisioning-db=# Look at the custom_properties section; this is how it is out of the box: custom_properties | {"hostName": "https://ss.cap.org/", "isExternal": "true", "privateKeyId": "root"} We will add an additional property called dcId, change the hostname to an FQDN rather than a URL, and keep endpointId blank: update endpoint_state set custom_properties = '{"dcId": "onprem", "hostName": "ss.cap.org", "endpointId": "", "isExternal": "true", "privateKeyId": "root"}' where name = 'vssc_idm'; Along with it, we have to change endpoint_properties too. This has to reflect the FQDN rather than the whole URL: endpoint_properties | {"hostName": "https://ss.cap.org/", "privateKeyId": "root"} Note: Before making changes, I'll take a snapshot of the vRA appliance. As we already logged into the database, let's go ahead and make the change. 
Execute the queries below and ensure they complete successfully. update endpoint_state set custom_properties = '{"dcId": "onprem", "hostName": "ss.cap.org", "endpointId": "", "isExternal": "true", "privateKeyId": "root"}' where name = 'vssc_idm'; update endpoint_state set endpoint_properties = '{"hostName": "ss.cap.org", "privateKeyId": "root"}' where name = 'vssc_idm'; provisioning-db=# update endpoint_state set custom_properties = '{"dcId": "onprem", "hostName": "ss.cap.org", "endpointId": "", "isExternal": "true", "privateKeyId": "root"}' where name = 'vssc_idm'; UPDATE 1 provisioning-db=# update endpoint_state set endpoint_properties = ' {"hostName": "ss.cap.org", "privateKeyId": "root"}' where name = 'vssc_idm'; UPDATE 1 provisioning-db=# select * from endpoint_state where name = 'vssc_idm'; -[ RECORD 1 ]-------------------+--------------------------------------------------------------------------------------------------------------------- document_self_link | /resources/endpoints/b2b02510-b0d5-46cf-9248-570b3d1bd58d document_auth_principal_link | /provisioning/auth/csp/users/cgs-lecvl28lpzqwhozt@provisioning-client.local document_expiration_time_micros | 0 document_owner | document_update_action | PATCH document_update_time_micros | 1646141393852000 document_version | 1 id | 6c2679af-a23c-4c88-8af0-3380305e3cde name | vssc_idm c_desc | custom_properties | {"dcId": "onprem", "hostName": "ss.cap.org", "endpointId": "", "isExternal": "true", "privateKeyId": "root"} tenant_links | ["/tenants/organization/c2eae67a-ff6d-4dae-9fd3-6594352a1f8a", "/tenants/project/1f32c781c7bac475-7f703c5265a63d87"] group_links | tag_links | org_auth_link | /tenants/organization/c2eae67a-ff6d-4dae-9fd3-6594352a1f8a project_auth_link | owner_auth_link | msp_auth_link | creation_time_micros | region_id | endpoint_links | compute_host_link | /resources/compute/c6f3a8ac-c700-41b2-a91d-91a3fdd73765 expanded_tags | document_creation_time_micros | 1646141393802000 endpoint_type | saltstack 
auth_credentials_link | /core/auth/credentials/eeb389af-fcd6-4b06-a0e9-5d178f128eed compute_link | /resources/compute/c6f3a8ac-c700-41b2-a91d-91a3fdd73765 compute_description_link | /resources/compute-descriptions/8a689d42-24f7-4f6d-b362-e85b6dc6f423 resource_pool_link | /resources/pools/1f32c781c7bac475-7f703c5265a63d87 parent_link | associated_endpoint_links | endpoint_properties | {"hostName": "ss.cap.org", "privateKeyId": "root"} maintenance_mode | mobility_endpoint_links | As you can see from the output above, we changed the custom_properties of the SSC integration in vRA. Exit the database by executing \q Now reboot the SaltStack appliance, then log out of vRA and log back in. Check whether the hostname now shows the FQDN rather than the URL. If so, it will successfully authenticate with the "Running Environment" in place: Running in polyglot! [2022-04-06 17:13:36,475] [INFO] - [saltstack-integration] Validating Salt Stack Config Server credentials... [2022-04-06 17:13:36,475] [INFO] - [saltstack-integration] Authenticating to a Salt Stack Config Server with url [https://ss.cap.org/account/login]... [2022-04-06 17:13:36,475] [INFO] - [saltstack-integration] Retrieving credentials from auth credentials link at [/core/auth/credentials/d7ea970e-cdca-42bc-b53d-ddac713a8666]... [2022-04-06 17:13:36,519] [INFO] - [saltstack-integration] Successfully retrieved credentials from auth credentials link [2022-04-06 17:13:36,519] [INFO] - [saltstack-integration] Retrieving Salt Stack Config Server XSRF token from url [https://ss.cap.org/account/login]... /run/abx-polyglot/function/urllib3/connectionpool.py:1050: InsecureRequestWarning: Unverified HTTPS request is being made to host 'ss.cap.org'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings InsecureRequestWarning, [2022-04-06 17:13:36,544] [INFO] - [saltstack-integration] Successfully retrieved Salt Stack Config Server XSRF token /run/abx-polyglot/function/urllib3/connectionpool.py:1050: InsecureRequestWarning: Unverified HTTPS request is being made to host 'ss.cap.org'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings InsecureRequestWarning, [2022-04-06 17:13:36,633] [INFO] - [saltstack-integration] Successfully authenticated to a Salt Stack Config Server [2022-04-06 17:13:36,634] [INFO] - [saltstack-integration] Successfully validated Salt Stack Config Server credentials Finished running action code. Exiting python process. Python process exited. Max Memory Used: 21 MB In Short The issue occurs because a URL rather than an FQDN is stored; when the API call to authenticate is executed, it gets a 403 error. Unless you fix this, you will not be able to successfully validate a Running Environment. If it's a new environment with no SaltStack resources, go ahead and delete the integration and re-create it. If it's an existing integration with resources in place, then modify the database as shown above: 1. Connect to the postgres database vracli dev psql 2. Connect to provisioning-db \c provisioning-db 3. Enable expanded display \x 4. Update the custom_properties value where the hostname is set to the URL of the SaltStack node rather than an FQDN. Remember to change it in the endpoint_state table as shown below. The name of the integration might be different if it was changed from the UI, so adjust accordingly. update endpoint_state set custom_properties = '{"dcId": "onprem", "hostName": "FQDN-SALTSTACKNODE", "endpointId": "", "isExternal": "true", "privateKeyId": "root"}' where name = 'vssc_idm'; 5. Update the endpoint_properties column value where the hostname is set to the URL, changing it to the FQDN. 
Almost same as above provisioning-db=# update endpoint_state set endpoint_properties = ' {"hostName": "FQDN-SALTSTACKNODE", "privateKeyId": "root"}' where name = 'vssc_idm'; Now add "Running Environment" and then validate. You should see a successful validation in place
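The database fix above boils down to replacing a URL-style hostName with a bare FQDN. A minimal Python sketch of that normalization, assuming only a string value like the one seen in the endpoint_state table (the function name is mine, not part of vRA):

```python
from urllib.parse import urlparse

def normalize_hostname(value: str) -> str:
    """Strip a hostName value down to a bare FQDN if it was stored as
    a full URL (e.g. "https://ss.cap.org/account/login"), which is the
    condition that breaks SaltStack Config validation with a 403."""
    if "://" in value:
        return urlparse(value).hostname or value
    return value

# A URL-style value becomes the FQDN vRA expects:
print(normalize_hostname("https://ss.cap.org/account/login"))  # ss.cap.org
print(normalize_hostname("ss.cap.org"))                        # ss.cap.org
```

This is the same transformation the update statements above apply by hand inside custom_properties and endpoint_properties.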

  • Implementing workaround to remediate CVE-2021-44228 for vRealize Log Insight 8.2 - 8.6

    Here's the PDF document of the same instructions.

Note: The content of this blog is the same as in KB 87089, but with screenshots and expected outputs to make things easier.

Purpose

CVE-2021-44228 has been determined to be present in vRealize Log Insight 8.2 - 8.6 via the Apache Log4j open source component it ships. This vulnerability and its impact on VMware products are documented in the following VMware Security Advisory (VMSA); please review this document before continuing: CVE-2021-44228 - VMSA-2021-0028

Resolution

The workarounds described in this document are meant to be a temporary solution only. Upgrades documented in the aforementioned advisory should be applied to remediate CVE-2021-44228 when available.

Workaround

To apply the workaround for CVE-2021-44228 to vRealize Log Insight, perform the following steps on each vRealize Log Insight node:

step:1 Download and copy the li-log4j-fix.sh script to the /tmp directory

step:2 SSH to the node (or use the Console by pressing Alt+F1), log in as root, and change to /tmp where the script has been copied

cd /tmp

step:3 List the files to confirm the li-log4j-fix.sh script is present

step:4 Run the below command to make the script executable

chmod +x /tmp/li-log4j-fix.sh

Once executed, you will see that the permissions of the file change

step:5 Execute the script

root@li [ /tmp ]# ./li-log4j-fix.sh
Hardening Log Insight appliance against CVE-2021-44228. For more information refer to: https://www.tenable.com/cve/CVE-2021-44228.
Patching Log Insight Java options: /etc/default/loginsight... SUCCESS
Patching Cassandra Java options: /usr/lib/loginsight/application/lib/apache-cassandra-*/conf/jvm.options... SUCCESS
Patching Tomcat Java options: /usr/lib/loginsight/application/3rd_party/apache-tomcat-*/bin/catalina.sh... SUCCESS
ATTENTION: Please restart Log Insight service for the patch to take effect.

step:6 Once done, perform a Log Insight service restart

service loginsight restart

Wait a few seconds until vRealize Log Insight is fully up.

NOTE: Since I have a standalone node for vRealize Log Insight, there was no need for me to upload and implement the patch on other nodes. If there are multiple nodes in your environment, these steps have to be followed on each node one after another. Ensure the Log Insight services are completely up and running before proceeding to the next server.

Validation

To verify the workaround for CVE-2021-44228 has been correctly applied to vRealize Log Insight, log into each node as root via SSH or Console (pressing Alt+F1 in a Console to log in) and run the following command:

ps axf | grep --color log4j2.formatMsgNoLookups | grep -v grep

Note: There should be output from the above command. If there was no output on any particular node(s), that node was not successfully modified; re-run the script on that node following the instructions above.
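The validation step above is just a string match against the process list. A small hedged Python helper that applies the same test to captured `ps axf` output (the function name is mine; it mirrors the grep pipeline, including excluding the grep process itself):

```python
def workaround_applied(ps_output: str) -> bool:
    """Return True if any process line carries the
    log4j2.formatMsgNoLookups JVM flag; equivalent to
    `ps axf | grep log4j2.formatMsgNoLookups | grep -v grep`."""
    return any(
        "log4j2.formatMsgNoLookups" in line and "grep" not in line
        for line in ps_output.splitlines()
    )

patched   = "1234 ?  Sl  java -Dlog4j2.formatMsgNoLookups=true -jar li.jar"
unpatched = "1234 ?  Sl  java -jar li.jar"
print(workaround_applied(patched))    # True
print(workaround_applied(unpatched))  # False
```

On a multi-node deployment you could feed this the output collected from each node to spot any that were missed.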

  • Fetching vRLCM API Token and Executing APIs using Postman

    Every LCM installation comes with inbuilt swagger documentation, which can be accessed at the https://<>/api endpoint of the LCM installation. Please refer to the swagger documentation for details of the REST methods of the APIs mentioned below. The swagger API page provides example payloads and also allows the user to try the APIs from the web page itself.

Authentication

All the APIs need a valid authentication token to be provided in the header.

As a first step, create an environment in Postman. You would have to mention the vRealize Suite Lifecycle Manager URL, username, and password.

Once done, to fetch the token, execute a POST call to:

https://{{lcmurl}}/lcm/authzn/api/login

When we click on "SEND", the response body indicates a successful login. When you click on Headers, the value you see under Set-Cookie is your token.

Now, using the cookie fetched above, I am able to execute API calls to perform certain functions on vRSLCM 8.x. In the below example I am fetching certificates from the Locker using an API call; I used the session ID generated above as the Set-Cookie value inside the call:

https://{{lcmurl}}/lcm/locker/api/certificates

In the same way, one can fetch passwords from the Locker as well. Shall add more examples soon.
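The same Postman flow can be scripted. Below is a rough Python sketch using only the standard library: POST to the login endpoint, reuse the Set-Cookie value on the Locker certificates endpoint. The JSON field names in the login payload are my assumption (check the swagger page for the exact schema), and certificate verification is disabled as one might in a lab:

```python
import json
import ssl
import urllib.request

def session_cookie(headers) -> str:
    """Pick the session token out of a list of (name, value) response
    headers; vRSLCM returns it in Set-Cookie."""
    for name, value in headers:
        if name.lower() == "set-cookie":
            return value.split(";")[0]
    raise KeyError("no Set-Cookie header in response")

def fetch_certificates(lcm_url: str, username: str, password: str):
    ctx = ssl._create_unverified_context()  # lab only; verify certs in production
    # Payload field names below are an assumption, not confirmed against swagger.
    body = json.dumps({"username": username, "password": password}).encode()
    req = urllib.request.Request(
        f"{lcm_url}/lcm/authzn/api/login", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, context=ctx) as resp:
        cookie = session_cookie(resp.getheaders())
    req = urllib.request.Request(
        f"{lcm_url}/lcm/locker/api/certificates",
        headers={"Cookie": cookie})
    with urllib.request.urlopen(req, context=ctx) as resp:
        return json.load(resp)

# The header parsing can be exercised without a live appliance:
print(session_cookie([("Set-Cookie", "JSESSIONID=abc123; Path=/; HttpOnly")]))
```

The same cookie can be sent on any other vRSLCM endpoint, such as the Locker passwords API mentioned above.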

  • Changing the IP Address of vRLCM after its deployment

    Is there a use-case where one needs to change the IP address of vRLCM after it's already deployed and running? Let's discuss the procedure.

Take a snapshot of vRLCM (no memory and no quiescing)

Modify existing DNS records to accommodate the new IP address

SSH to the existing vRLCM appliance and execute the below command, which assists in changing the network configuration of the appliance:

/opt/vmware/share/vami/vami_config_net

Press 0 to show the existing configuration

Press 6 to change the IP configuration as needed

Note: Your session will be disconnected as soon as it says "Reconfiguring eth0", which is expected as the IP address has changed

Open a new session and execute the same command again to display the new network configuration

You may also reboot the vRLCM appliance to confirm it comes back up properly. Also, go to System Settings and check the new configuration.

  • Workaround to address CVE-2021-44228 in vRealize Suite Lifecycle Manager 8.x with screenshots

    Steps mentioned in this article are taken from VMware's KB article 87097. The only difference is that this blog has screenshots, which would be helpful while implementing the patch. I've documented these steps with screenshots and outputs in this PDF too; click to download and see the detailed outputs available when the workaround is implemented.

NOTE: If you have deployed vRSLCM on an earlier 8.x release, that is 8.0 or 8.1, there is a chance that a leftover file named "vmlcm-service-8.1.x-SNAPSHOT.jar" or "vmlcm-service-8.0.x-SNAPSHOT.jar" is present, in which case the workaround will fail with the message "vRSLCM services jar does not exist". To fix this issue, move the old file to a different location and then execute the script. The structure of the folder would always look like below: the vmlcm-service-8.6.0-SNAPSHOT.jar file indicates that this is the current file the service uses; it has a version indicator too. Remember, the workaround works only for vRSLCM 8.2 onwards.

Details

CVE-2021-44228 has been determined to impact vRealize Suite Lifecycle Manager 8.2.0 - 8.6.x via the Apache Log4j open source component it ships. This vulnerability and its impact on VMware products are documented in the following VMware Security Advisory (VMSA); please review this document before continuing: CVE-2021-44228 - VMSA-2021-0028

Resolution

The workarounds described in this document are meant to be a temporary solution only. Upgrades documented in the aforementioned advisory should be applied to remediate CVE-2021-44228 when available.

Workaround

Step:1 Take a snapshot of the vRealize Suite Lifecycle Manager appliance as shown below.
One can take the snapshot from the vCenter UI too.

Step:2 Download and copy the attached log4jfix.sh file from VMware's KB article 87097 to the /tmp directory of the vRSLCM appliance

Step:3 Change to the /tmp directory

cd /tmp

Run the following command to make the log4jfix.sh script executable:

chmod +x log4jfix.sh

Step:4 Then execute the script as shown below. * indicates there are other lines in between; detailed output is present in the PDF document attached.

root@lcm [ /tmp ]# ./log4jfix.sh
Get the version of jar
vRSLCM version: 860
Blackstone version: 861
Archive: vmlcm-service-8.6.0-SNAPSHOT.jar
  creating: META-INF/
 inflating: META-INF/MANIFEST.MF
  creating: org/
  creating: org/springframework/
  creating: org/springframework/boot/
  creating: org/springframework/boot/loader/
 inflating: org/springframework/boot/loader/Launcher.class
 inflating: org/springframework/boot/loader/JarLauncher.class
  creating: org/springframework/boot/loader/archive/
 inflating: org/springframework/boot/loader/archive/JarFileArchive$JarFileEntry.class
  creating: org/springframework/boot/loader/data/
*
*
*
extracting: BOOT-INF/lib/spring-plugin-core-1.2.0.RELEASE.jar
extracting: BOOT-INF/lib/spring-plugin-metadata-1.2.0.RELEASE.jar
extracting: BOOT-INF/lib/mapstruct-1.2.0.Final.jar
extracting: BOOT-INF/lib/springfox-swagger-ui-2.9.2.jar
updating: BOOT-INF/classes/log4j2.xml
zip warning: Local Entry CRC does not match CD: BOOT-INF/classes/log4j2.xml (deflated 60%)
test of vmlcm-service-8.6.0-SNAPSHOT.jar OK
Zip action: 0
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Waiting for vRLCM services to start.
Archive: blackstone-external-8.6.1.jar
  creating: META-INF/
 inflating: META-INF/MANIFEST.MF
  creating: org/
  creating: org/springframework/
  creating: org/springframework/boot/
  creating: org/springframework/boot/loader/
 inflating: org/springframework/boot/loader/Launcher.class
 inflating: org/springframework/boot/loader/JarLauncher.class
*
*
*
extracting: BOOT-INF/lib/springfox-swagger-ui-2.9.2.jar
extracting: BOOT-INF/lib/log4j-core-2.8.2.jar
extracting: BOOT-INF/lib/log4j-api-2.8.2.jar
updating: BOOT-INF/classes/log4j2.xml
zip warning: Local Entry CRC does not match CD: BOOT-INF/classes/log4j2.xml (deflated 61%)
test of blackstone-external-8.6.1.jar OK
Zip action: 0
Waiting for Blackstone services to start.
Waiting for Blackstone services to start.
Waiting for Blackstone services to start.
Waiting for Blackstone services to start.

The script is now implemented. It will take approximately 5 to 8 minutes to complete.
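The leftover-jar pitfall called out in the NOTE at the top of this article is easy to check for before running the script. A hedged Python sketch that flags old vmlcm-service SNAPSHOT jars in a directory listing (the function name, version format, and default version are my assumptions for illustration):

```python
import re

def stale_service_jars(filenames, current="8.6.0"):
    """Return vmlcm-service SNAPSHOT jars whose version differs from
    the current release; leftovers from 8.0/8.1 make log4jfix.sh fail
    with "vRSLCM services jar does not exist"."""
    pattern = re.compile(r"vmlcm-service-(\d+\.\d+\.\d+)-SNAPSHOT\.jar$")
    stale = []
    for name in filenames:
        m = pattern.match(name)
        if m and m.group(1) != current:
            stale.append(name)
    return stale

listing = ["vmlcm-service-8.1.0-SNAPSHOT.jar", "vmlcm-service-8.6.0-SNAPSHOT.jar"]
print(stale_service_jars(listing))  # ['vmlcm-service-8.1.0-SNAPSHOT.jar']
```

Any jar the helper flags should be moved out of the service folder before executing the workaround script, as described in the NOTE.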

  • Upgrading to vRealize Automation 8.8.1

    vRealize Automation 8.8.1 was released last evening, and here's my experience implementing it in my lab. I've attached the Upgrade runbook vRA 8.8.1 Deep-Dive.pdf document, which contains all of the steps I took for a successful upgrade. Watch this space for a video that will explain the whole upgrade process in detail soon.
