
Upgrade Guide CX-4.4.10 to CX-4.7

Before upgrading, ensure that the system is idle, i.e., all agents are logged out of the AgentDesk.
Keep the system idle for at least 30 minutes so that the reporting data is fully synced.
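As an optional pre-check, you can confirm that the reporting sync has had a recent successful run before taking the system down. This is only a sketch; it assumes the CX components run in the expertflow namespace, as in the later steps of this guide.

CODE
# List cron jobs and their last schedule time
kubectl -n expertflow get cronjobs
# Inspect the most recent job runs; COMPLETIONS 1/1 indicates a finished reporting sync
kubectl -n expertflow get jobs --sort-by=.metadata.creationTimestamp | tail -5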

  1. Clone the CX repository on the target server

    CODE
    # Create CX-4.7 directory from root
    mkdir CX-4.7
    # Navigate to CX-4.7
    cd CX-4.7
    # Clone the CX-4.7 branch of cim-solution repository
    git clone -b CX-4.7 https://efcx:RecRpsuH34yqp56YRFUb@gitlab.expertflow.com/cim/cim-solution.git
    # Navigate back to the previous (root) directory
    cd ..
  2. Stop all core components.

    CODE
    # Navigate to the kubernetes folder of the existing release, i.e., CX-4.4.10
    cd cim-solution/kubernetes
    kubectl delete -f cim/Deployments
    kubectl delete -f cim/StatefulSet/ef-amq-statefulset.yaml
    
    
    # Stop reporting connector
    kubectl delete -f pre-deployment/reportingConnector/ef-reporting-connector-cron.yaml -n expertflow
    # delete configmap for reporting connector 
    kubectl -n expertflow delete configmap ef-reporting-connector-conf
    
    # Stop team-announcement cron job
    kubectl delete -f pre-deployment/team-announcement/ef-team-announcement-cronjob.yaml -n expertflow
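    Before moving on, it can help to confirm that the core components have actually stopped; a minimal check, assuming they run in the expertflow namespace:

    CODE
    # No core deployment pods or ef-amq pod should remain
    kubectl -n expertflow get pods
    # The reporting-connector and team-announcement cron jobs should no longer be listed
    kubectl -n expertflow get cronjobs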
  3. Enable TLS and authentication for the stateful components using the guide: TLS Enablement for Stateful Components

  4. Update MongoDB Application Data

    1. Copy the upgrade script to the Mongo container

      CODE
      kubectl -n ef-external cp scripts/mongo/4.4-4.5_upgrade.js mongo-mongodb-0:/tmp/4.4-4.5_upgrade.js
    2. Execute the script

      CODE
      kubectl -n ef-external exec -ti mongo-mongodb-0 -- mongosh --file /tmp/4.4-4.5_upgrade.js
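      To spot-check that the script ran against the intended data, you can list the databases from the same pod; this is only a sketch and assumes mongosh connects the same way as in the command above:

      CODE
      kubectl -n ef-external exec -ti mongo-mongodb-0 -- mongosh --eval "db.adminCommand({ listDatabases: 1 })"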
  5. Update the AMQ StatefulSet.

    CODE
    1) Open cim/StatefulSet/ef-amq-statefulset.yaml file.
    2) Update AMQ Tag 
       gitimages.expertflow.com/general/activemq-k8s:6.0.0-alpine-zulu-K8s-4.6_f-CIM-15619-4.7                                              
    3) Add following new environment variables
               - name: REDIS_CLIENT_CERT
                 value: "/redis/tls.crt"
               - name: REDIS_CLIENT_KEY
                 value: "/redis/tls.key"
               - name: REDIS_CA_CERT
                 value : "/redis/ca.crt"
               - name: TRUST_STORE_PASSWORD
                 value: "Expertflow123"
               - name: KEY_STORE_PASSWORD
                 value: "Expertflow123"
               - name: ACTIVEMQ_OPTS_MEMORY
                 value: "-Xms512M -Xmx4G"
               - name: MAX_CONNECTIONS
                 value: "5000"
    4) Update the following existing environment variables
               - name: REDIS_SSL_ENABLED
                 value: "true"
               - name: REDIS_MAX_ACTIVE
                 value: "200"
               - name: REDIS_MAX_IDLE
                 value: "200"
               - name: REDIS_MIN_IDLE
                 value: "100" 
    5) Add following Volumes "redis-crt"
    spec:
      replicas: 1
      serviceName: ef-amq-svc
      selector:
        matchLabels:
          app: ef-amq
      template:
        metadata:
          labels:
            app: ef-amq
            ef: expertflow         
        spec:
          volumes:
          # Add following lines
          - name: redis-crt
            secret:
              secretName: redis-crt                         
    6) Add following VolumeMounts "redis-crt"
    
             env:
               - name: ACTIVEMQ_OPTS_MEMORY
                 value: "-Xms512M -Xmx4G"
               - name: MAX_CONNECTIONS
                 value: "5000"
             volumeMounts:
              - name: activemq-data
                mountPath: "/opt/activemq/data"
    # Add following lines            
              - name: redis-crt
                mountPath: "/redis"
    7) Apply the ActiveMQ StatefulSet
      kubectl apply -f cim/StatefulSet/ef-amq-statefulset.yaml
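    After applying the StatefulSet, wait for the new ActiveMQ pod to become ready; a minimal check, assuming the core components run in the expertflow namespace and using the app=ef-amq label shown above:

    CODE
    # The AMQ pod should return to Running/Ready
    kubectl -n expertflow get pods -l app=ef-amq
    # Tail the broker logs to confirm ActiveMQ started and connected to Redis over TLS
    kubectl -n expertflow logs -l app=ef-amq --tail=50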
  6. Update the ConfigMaps.

    CODE
     # Update 360 Connector ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-360-connector-configmap.yaml file.
      2) Add the following new environment variable
         MASKING_LAYOUT_CLASS: com.ef.connector360.utility.MaskingPatternLayout
      3) Update the following environment variable
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
         
     # Update agent Manager Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-agent-manager-configmap.yaml  file.
      2) Add following new environment variables
         FINESSE_URL: https://uccx12-5p.ucce.ipcc:8445
         SOCKET_DISCONNECT_TIME: "10000"
      3) update the following environment variables
         LOG_LEVEL: debug
         
     # Update Bot framework Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-bot-framework-configmap.yaml   file.
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.botframework.commons.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: https://ef-file-engine-svc:8443
         
     # Update CCM Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-ccm-configmap.yaml   file.
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.ccm.utils.MaskingPatternLayout
    
    # Update Common Environment Variables ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-common-environment.yaml   file.
      2) Add following new environment variables
         DEFAULT_ROOM_NAME: CC
         DEFAULT_ROOM_DESCRIPTION: Contact Center Room
         DEFAULT_ROOM_LABEL: CC
         ROOM_IS_USER_ID: "false"
         CONVERSATION_SEARCH_WINDOW_HRS: "24"
         MASK_ATTRIBUTES_PATH: /sensitive.js
         LOGGING_CONFIG: /logback/logback-spring.xml
         IS_ENABLED_2FA: "false"
         CHANNEL_2FA: "app"
         TWILIO_SID: ""
         TWILIO_VERIFY_SID: ""
         TWILIO_AUTH_TOKEN: ""
      3) update the following environment variables
         TZ: UTC
         
    # Update Connection Environment ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-connection-env-configmap.yaml   file.
      2) Add following new environment variables
         MONGODB_CLIENT_CERT: /mongo/client-pem
         MONGODB_CA_CERT: /mongo/mongodb-ca-cert
         TRUST_STORE_PASSWORD: "Expertflow123"
         KEY_STORE_PASSWORD: "Expertflow123"
         REDIS_CLIENT_CERT: /redis/tls.crt
         REDIS_CLIENT_KEY: /redis/tls.key
         REDIS_CA_CERT: /redis/ca.crt
      3) update the following environment variables
        From MONGODB_ENABLE_SSL: "false" to MONGODB_ENABLE_SSL: "true"
        From REDIS_SSL_ENABLED: "false" to REDIS_SSL_ENABLED: "true"
        From MONGO_HOST: mongodb://mongo-mongodb.ef-external.svc.cluster.local to MONGO_HOST: mongo-mongodb.ef-external.svc.cluster.local
        From MONGODB_PASSWORD: da17deWHI0 to MONGODB_PASSWORD: Expertflow123
         
    # update Conversation Manager Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-conversation-manager-configmap.yaml file.
      2) Add following new environment variables
         UNIFIED_ADMIN_URL: http://ef-unified-admin-svc:3000
         MASKING_LAYOUT_CLASS: com.ef.conversationmanager.utility.MaskingPatternLayout
    
    # update Customer widget Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-customer-widget-configmap.yaml
      2) Add following new environment variables
         AUTHENTICATOR_URL: https://cim.expertflow.com/secure-link
         ENABLE_LOGO: "false"
         ADDITIONAL_PANEL: "true"
         USERNAME_ENABLED: "true"
         
    # update facebook Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-facebook-connector-configmap.yaml
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.connector.facebookconnector.utility.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
    
    # Update File Engine ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-file-engine-configmap.yaml
      2) Update the following environment variable
         USESSL: "false"
         
    # update Instagram Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-instagram-connector-configmap.yaml
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.connector.instagramconnector.utils.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
    
    # Update License Manager ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-license-manager-configmap.yaml
      2) update the following environment variables
        From jdbc:postgresql://ef-postgresql.ef-external.svc.cluster.local:5432/licenseManager 
        to jdbc:postgresql://ef-postgresql.ef-external.svc.cluster.local:5432/licenseManager?sslmode=verify-ca&sslrootcert=/postgresql/ca.crt
    
    # update Real Time Reporting Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-realtime-reporting-configmap.yaml
      2) update the following environment variables
         DATASOURCE_USERNAME: elonmusk
         DATASOURCE_PASSWORD: 68i3nj7t
         UNIFIED_ADMIN_URL: http://ef-unified-admin-svc:3000
    
    # update Routing engine Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-routing-engine-configmap.yaml file.
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.mediaroutingengine.global.utilities.MaskingPatternLayout  
         IS_QUEUE_PRIORITY_ENABLED: "false"
         
    # Update Scheduled Activities Configmap
      1) kubectl delete -f cim/ConfigMaps/ef-scheduled-activities-configmap.yaml
      2) Open cim/ConfigMaps/ef-scheduled-activities-configmap.yaml
      3) Delete following environment variables
         DB_DIALECT: org.hibernate.dialect.PostgreSQLDialect
         DB_PASSWORD: Expertflow123
         DB_URL: jdbc:postgresql://ef-postgresql.ef-external.svc.cluster.local:5432/scheduler
         DB_USERNAME: sa
      4) Add following new environment variables
        CACHED_OUTBOUND_MESSAGE_EXPIRY_IN_SECONDS: "86400"
        THIRD_PARTY_COMPONENTS_NEXT_RETRY_TIMEOUT_DURATION_IN_SECONDS: "600"
        THIRD_PARTY_COMPONENTS_RETRY_COUNT_ATTEMPTS_LIMIT: "5"
    
    
    # update SMPP Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-smpp-connector-configmap.yaml file.
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.sms.utils.MaskingPatternLayout 
    
    # update State Event logger Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-state-events-logger-configmap.yaml  file.
      2) update the following environment variables
         TOPIC_NAME: VirtualTopic.STATE_CHANNEL
         BATCH_TIMER_IN_SECONDS: "5"
         MAX_IDLE_CYCLES: "10" 
    
     # update Telegram Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-telegram-connector-configmap.yaml  file.     
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.spring.util.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080     
    
     # Update Twilio Connector ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-twilio-connector-configmap.yaml  file.     
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.twilio.connector.util.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
    
    # update  twitter Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-twitter-connector-configmap.yaml  file.
      2) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
    
    # Update Unified Admin ConfigMap
      1) Open kubernetes/cim/ConfigMaps/ef-unified-admin-configmap.yaml file.
      2) Add the following new environment variables, replacing <FQDN> with your deployment's FQDN (for example, devops.ef.com)
         SURVEY_NODERED_URL: https://<FQDN>/survey-studio
         SURVEY_API_URL: https://<FQDN>/survey-backend
         CAMPAIGN_NODERED_URL: https://<FQDN>/campaign-studio
         CAMPAIGN_API_URL: https://<FQDN>/campaigns
           
    # update unified agent Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-unified-agent-configmap.yaml  file.
      2) Add following new environment variables
         isCrmEventsEnabled: "false"
         SECURE_LINK_URL: https://devops.ef.com/secure-link
         finesseURLForAgent: https://122.129.75.138:8445
         ENABLE_SECURE_LINK: "false"
         Enable_Voice_Events_For_CRM: "false"
         IS_FINESSE_HA_ENABLED: "false"
         SECONDARY_FINESSE_URL: https://finesse12-5.ucce.ipcc:8445
         CISCO_SERVICE_IDENTIFIER: "0000"
         SIP_MONITORING_DN: "*44"
      3) update the following environment variables
         SIP_SOCKET_URL: wss://192.168.1.17:7443
         SIP_URI: 192.168.1.17
      4) Copy the AgentDesk translation files
         From CX-4.7/cim-solution/kubernetes/pre-deployment/app-translations/unified-agent/i18n
         To pre-deployment/app-translations/unified-agent
      5) Delete and create the ConfigMap
         kubectl delete cm ef-app-translations-cm -n expertflow
         kubectl -n expertflow create configmap ef-app-translations-cm --from-file=pre-deployment/app-translations/unified-agent/i18n/
      6) Copy the canned messages
         From CX-4.7/cim-solution/kubernetes/pre-deployment/app-translations/unified-agent/canned-messages/canned-messages.json
         To pre-deployment/app-translations/unified-agent/canned-messages
      7) Delete and create the ConfigMap
         kubectl delete cm ef-canned-messages-cm -n expertflow
         kubectl -n expertflow create configmap ef-canned-messages-cm --from-file=pre-deployment/app-translations/unified-agent/canned-messages
         
     # update Viber Connector Configmap
      1) Open kubernetes/cim/ConfigMaps/ef-viber-connector-configmap.yaml  file.     
      2) Add following new environment variables
         MASKING_LAYOUT_CLASS: com.ef.connector.utils.MaskingPatternLayout
      3) update the following environment variables
         FILE_ENGINE_URL: http://ef-file-engine-svc:8080
    
    # Update the Team announcement configmap
      1) kubectl delete -f cim/ConfigMaps/ef-team-announcement-configmap.yaml
      2) Open cim/ConfigMaps/ef-team-announcement-configmap.yaml
      3) Add following new environment variables
         UNIFIED_ADMIN_URL: http://ef-unified-admin-svc:3000          
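    Before applying anything, a quick spot-check of the edited files can catch a missed ConfigMap. A minimal sketch, run from the current release's cim-solution/kubernetes directory (file names as listed in the steps above):

    CODE
    # List the ConfigMaps that now define the masking layout class
    grep -Rl "MASKING_LAYOUT_CLASS" cim/ConfigMaps/
    # Confirm the TLS-related flags were flipped in the connection environment ConfigMap
    grep -nE "MONGODB_ENABLE_SSL|REDIS_SSL_ENABLED" cim/ConfigMaps/ef-connection-env-configmap.yaml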
  7. Add the logback ConfigMap file

    CODE
     1) Copy the logback directory
        from CX-4.7/cim-solution/kubernetes/pre-deployment/logback to the current release's kubernetes/pre-deployment
     2) Create the ConfigMap
        kubectl apply -f pre-deployment/logback/
        kubectl -n expertflow create configmap ef-logback-cm --from-file=pre-deployment/logback/logback-spring.xml
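    To verify the ConfigMap was created with the expected file (assuming the expertflow namespace as above):

    CODE
    # The Data section should list logback-spring.xml
    kubectl -n expertflow describe configmap ef-logback-cm | head -20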
  8. Update Conversation Controller training.

    CODE
     # Update custom training
      1) Delete controller configmaps
          kubectl -n expertflow delete configmap ef-conversation-controller-actions-cm 
          kubectl -n expertflow delete configmap ef-conversation-controller-actions-pycache-cm 
          kubectl -n expertflow delete configmap ef-conversation-controller-actions-utils-cm
      2) Copy conversation-Controller folder  
          Copy CX-4.7/cim-solution/kubernetes/pre-deployment/conversation-Controller into current release
          cim-solution/kubernetes/pre-deployment/conversation-Controller
      3) Create configMaps
          kubectl -n expertflow create configmap ef-conversation-controller-actions-cm --from-file=pre-deployment/conversation-Controller/actions
          kubectl -n expertflow create configmap ef-conversation-controller-actions-utils-cm --from-file=pre-deployment/conversation-Controller/utils
          kubectl -n expertflow create configmap ef-conversation-controller-actions-pycache-cm --from-file=pre-deployment/conversation-Controller/__pycache__
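    A quick way to confirm all three ConfigMaps were recreated:

    CODE
    # All three conversation-controller ConfigMaps should appear with a fresh AGE
    kubectl -n expertflow get configmap | grep conversation-controller-actions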
  9. Add the new Conversation Monitor component.

    CODE
    Copy the ConfigMap, Deployment, Service, and Ingress for the new component from CX-4.7 to your current release
    # Add ConfigMap
      Copy from CX-4.7/cim-solution/kubernetes/cim/ConfigMaps/ef-conversation-monitor-configmap.yaml
      to cim-solution/kubernetes/cim/ConfigMaps
    # Add Service
      Copy from CX-4.7/cim-solution/kubernetes/cim/Services/ef-conversation-monitor-service.yaml
      to cim-solution/kubernetes/cim/Services
    # Add Deployment
      Copy from CX-4.7/cim-solution/kubernetes/cim/Deployments/ef-conversation-monitor-deployment.yaml
      to cim-solution/kubernetes/cim/Deployments
      1) Open cim/Deployments/ef-conversation-monitor-deployment.yaml file.
      2) Update tag gitimages.expertflow.com/cim/conversation-monitor:4.5.5_f-CIM-14675
    # Add Ingress, copy nginx or traefik as per your deployment
      For RKE2 Copy from CX-4.7/cim-solution/kubernetes/cim/Ingresses/nginx/ef-conversation-monitor-Ingress.yaml
      to cim-solution/kubernetes/cim/Ingresses/nginx
      For K3s Copy from CX-4.7/cim-solution/kubernetes/cim/Ingresses/traefik/ef-conversation-monitor-Ingress.yaml
      to cim-solution/kubernetes/cim/Ingresses/traefik
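    Before deploying, you can confirm that the image tag was updated in the copied manifest; a minimal check run from cim-solution/kubernetes:

    CODE
    # Should print the conversation-monitor image with the 4.5.5_f-CIM-14675 tag
    grep -n "image:" cim/Deployments/ef-conversation-monitor-deployment.yaml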
  10. Update the Reporting Connector

    CODE
    # Update ef-reporting-connector-cron.yaml
      1) Replace the reportingConnector directory
         Copy from CX-4.7/cim-solution/kubernetes/pre-deployment/reportingConnector
         to cim-solution/kubernetes/pre-deployment/reportingConnector
      2) Delete the reporting connector ConfigMap
         kubectl -n expertflow delete configmap ef-reporting-connector-conf
      3) Create the reporting connector ConfigMap
         kubectl -n expertflow create configmap ef-reporting-connector-conf --from-file=pre-deployment/reportingConnector/reporting-connector.conf
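    To confirm the recreated ConfigMap picked up the CX-4.7 configuration file:

    CODE
    # The Data section should contain reporting-connector.conf
    kubectl -n expertflow describe configmap ef-reporting-connector-conf | head -20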
  11. Run the SQL update script against the current historical reports database

    CODE
    # Run the script as per your configured DB
    # MySQL Script
    CX-4.7/cim-solution/kubernetes/pre-deployment/reportingConnector/dbScripts/dbupdate/historical_reports_db_update_script_MYSQL_4.4.10_to_4.7.sql
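    The guide does not prescribe a client for running the script. As one hedged example for MySQL, with placeholder host, user, and database names for your environment:

    CODE
    # Run the update script against the historical reports database (replace the placeholders)
    mysql -h <DB_HOST> -u <DB_USER> -p <HISTORICAL_REPORTS_DB> \
      < CX-4.7/cim-solution/kubernetes/pre-deployment/reportingConnector/dbScripts/dbupdate/historical_reports_db_update_script_MYSQL_4.4.10_to_4.7.sql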
  12. Update the following core deployments.

Update the replica count for components as per your workload.

CODE
# Update deployments
1) Copy the Deployments folder

Copy CX-4.7/cim-solution/kubernetes/cim/Deployments into the current release's
cim-solution/kubernetes/cim/Deployments
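To review the replica counts before applying, a simple sketch run from the current release's cim-solution/kubernetes directory:

CODE
# List the replica count declared in each copied deployment so it can be adjusted per workload
grep -n "replicas:" cim/Deployments/*.yaml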
  13. Update the following Services

    CODE
    # Update File engine Service
      1) kubectl delete -f kubernetes/cim/Services/ef-file-engine-service.yaml  
      2) Open kubernetes/cim/Services/ef-file-engine-service.yaml  file.
      3) Update the annotation to use http
         annotations:
           traefik.ingress.kubernetes.io/service.serversscheme: http
         
      4) change the port and targetPort to 8080
         ports:
         - name: ef-file-engine-svc-8080
           port: 8080
           targetPort: https-fi-m-8080
      5) kubectl apply -f kubernetes/cim/Services/ef-file-engine-service.yaml 
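    After re-applying the Service, you can confirm it now exposes port 8080; a minimal check, assuming the service name ef-file-engine-svc used in the ConfigMaps above and the expertflow namespace:

    CODE
    # PORT(S) should show 8080
    kubectl -n expertflow get svc ef-file-engine-svc -o wide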
  14. Update the following Ingresses

    CODE
     FOR RKE2
     
     # Update File engine Ingress
      1) kubectl delete -f kubernetes/cim/Ingresses/nginx/ef-file-engine-Ingress.yaml  
      2) Open kubernetes/cim/Ingresses/nginx/ef-file-engine-Ingress.yaml  file.
      3) Remove the following lines.
         nginx.ingress.kubernetes.io/backend-protocol: HTTPS
         nginx.ingress.kubernetes.io/proxy-body-size: 20m
         nginx.ingress.kubernetes.io/proxy-connect-timeout: 600s
         nginx.ingress.kubernetes.io/proxy-read-timeout: 600s
         nginx.ingress.kubernetes.io/proxy-send-timeout: 600s
      4) Add the following annotation
         annotations:
           nginx.ingress.kubernetes.io/proxy-body-size: 8m
      5) Change the port
         port:
           number: 8080
      6) kubectl apply -f kubernetes/cim/Ingresses/nginx/ef-file-engine-Ingress.yaml
       
    # Update Scheduled Activities Ingress
      1) change the port 
         port:
           number: 8894
     # Update twitter connector Ingress
       1) change the port 
         port:
           number: 8080
     
     
      FOR K3s
    
     # Update File engine Ingress
      1) kubectl delete -f kubernetes/cim/Ingresses/traefik/ef-file-engine-Ingress.yaml
      2) Open kubernetes/cim/Ingresses/traefik/ef-file-engine-Ingress.yaml  file.
      3) change the port 
         port:                
           number: 8080
      4) kubectl apply -f kubernetes/cim/Ingresses/traefik/ef-file-engine-Ingress.yaml     
  15. Start all deployments.

    CODE
    # Apply all ConfigMaps
    kubectl apply -f cim/ConfigMaps
    # Apply all deployments
    kubectl apply -f cim/Deployments
    
    # Apply service for conversation-monitor
    kubectl apply -f cim/Services/ef-conversation-monitor-service.yaml
    # Apply Ingress for conversation-monitor
      For RKE2
      kubectl apply -f cim/Ingresses/nginx/ef-conversation-monitor-Ingress.yaml
      For K3s
      kubectl apply -f cim/Ingresses/traefik/ef-conversation-monitor-Ingress.yaml
    
    # Apply reporting-connector cron job
    kubectl apply -f pre-deployment/reportingConnector/ef-reporting-connector-cron.yaml -n expertflow
    
    # Apply team-announcement cron job
    kubectl apply -f pre-deployment/team-announcement/ef-team-announcement-cronjob.yaml -n expertflow
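    Once everything is applied, run a final health check; a minimal sketch, assuming all core components run in the expertflow namespace:

    CODE
    # All core pods should reach Running/Ready; investigate any CrashLoopBackOff before handing the system back to agents
    kubectl -n expertflow get pods
    # The reporting-connector and team-announcement cron jobs should be scheduled again
    kubectl -n expertflow get cronjobs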
  16. Upgrade guide for CX Voice.

  17. Voice Connector Upgrade guide

  18. Guide for migrating Keycloak Groups/Teams to CX Teams: https://expertflow-docs.atlassian.net/l/cp/0aRTcLkk

  19. Upgrade Grafana

Grafana Upgrade Guide
  1. Navigate to the current release cim-solution/kubernetes folder

    CODE
    cd cim-solution/kubernetes 

For the following steps, run only the commands for your configured reporting database type (MySQL or MSSQL).

  2. Upgrade the Grafana Configs

    1. If you have MySQL

      1. Copy the Supervisor_Dashboard_CIM-mysql.json and Agent_Dashboard_CIM-mysql.json files from the CX-4.7 folder to the
        current cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards folder

        CODE
        From CX-4.7/cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards/Supervisor_Dashboard_CIM-mysql.json
        To post-deployment/config/grafana/supervisor-dashboards/
        
        From CX-4.7/cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards/Agent_Dashboard_CIM-mysql.json
        To post-deployment/config/grafana/supervisor-dashboards/
      2. Delete dashboard ConfigMap

        CODE
        kubectl -n ef-external delete configmap ef-grafana-supervisor-dashboard-mysql
        kubectl -n ef-external delete configmap ef-grafana-agent-dashboard-mysql
      3. Create dashboard ConfigMap

        CODE
        kubectl create configmap ef-grafana-supervisor-dashboard-mysql -n ef-external --from-file=post-deployment/config/grafana/supervisor-dashboards/Supervisor_Dashboard_CIM-mysql.json
        kubectl create configmap ef-grafana-agent-dashboard-mysql -n ef-external --from-file=post-deployment/config/grafana/supervisor-dashboards/Agent_Dashboard_CIM-mysql.json
    2. If you have MSSQL

      1. Copy the Supervisor_Dashboard_CIM-mssql.json and Agent_Dashboard_CIM-mssql.json files from the CX-4.7 folder to the current cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards folder

        CODE
        From CX-4.7/cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards/Supervisor_Dashboard_CIM-mssql.json
        To post-deployment/config/grafana/supervisor-dashboards/
        
        From CX-4.7/cim-solution/kubernetes/post-deployment/config/grafana/supervisor-dashboards/Agent_Dashboard_CIM-mssql.json
        To post-deployment/config/grafana/supervisor-dashboards/
      2. Delete dashboard ConfigMap

        CODE
        kubectl -n ef-external delete configmap ef-grafana-supervisor-dashboard-mssql
        kubectl -n ef-external delete configmap ef-grafana-agent-dashboard-mssql
      3. Create dashboard ConfigMap

        CODE
        kubectl create configmap ef-grafana-supervisor-dashboard-mssql -n ef-external --from-file=post-deployment/config/grafana/supervisor-dashboards/Supervisor_Dashboard_CIM-mssql.json
        kubectl create configmap ef-grafana-agent-dashboard-mssql -n ef-external --from-file=post-deployment/config/grafana/supervisor-dashboards/Agent_Dashboard_CIM-mssql.json
  3. Reinstall Grafana

    1. Uninstall Grafana

      CODE
      helm uninstall grafana -n ef-external
    2. Install Grafana

      CODE
      helm upgrade --install=true --wait=true --timeout=10m0s --debug --namespace=ef-external --values=external/bitnami/grafana/values.yaml grafana external/bitnami/grafana
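      After the reinstall, you can confirm Grafana is back up and the dashboard ConfigMaps are in place; a minimal check, assuming the release deploys into ef-external as above:

      CODE
      # The Grafana pod should reach Running/Ready
      kubectl -n ef-external get pods | grep grafana
      # The recreated dashboard ConfigMaps should be present
      kubectl -n ef-external get configmap | grep dashboard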

CODE
# Finally, replace the following files and folders in the current release with their CX-4.7 counterparts

# Replace the AgentGadget.js file from the new release at the following path
kubernetes/post-deployment/3rdPartyResources/Finesse-gadget/AgentGadget.js

# Replace the "supervisor-dashboards" folder at the following path
kubernetes/post-deployment/config/grafana/supervisor-dashboards

# Replace the "unified-agent" folder at the following path
kubernetes/pre-deployment/app-translations/unified-agent

# Copy the "crm-service" folder from the following path
kubernetes/pre-deployment/crm-service

# Replace the "grafana" folder at the following path
kubernetes/pre-deployment/grafana

# Replace the "licensemanager" folder at the following path
kubernetes/pre-deployment/licensemanager

# Replace the "dbcreation" folder at the following path
kubernetes/pre-deployment/reportingConnector/dbScripts/dbcreation

# Replace the "dbupdate" folder at the following path
kubernetes/pre-deployment/reportingConnector/dbScripts/dbupdate
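A minimal sketch of the replacements listed above, assuming the commands are run from the current release's cim-solution/kubernetes directory and that SRC points to wherever the CX-4.7 clone from step 1 lives:

CODE
# Adjust SRC to the location of your CX-4.7 clone
SRC=CX-4.7/cim-solution/kubernetes
cp $SRC/post-deployment/3rdPartyResources/Finesse-gadget/AgentGadget.js post-deployment/3rdPartyResources/Finesse-gadget/
cp -r $SRC/post-deployment/config/grafana/supervisor-dashboards/. post-deployment/config/grafana/supervisor-dashboards/
cp -r $SRC/pre-deployment/app-translations/unified-agent/. pre-deployment/app-translations/unified-agent/
cp -r $SRC/pre-deployment/crm-service pre-deployment/
cp -r $SRC/pre-deployment/grafana/. pre-deployment/grafana/
cp -r $SRC/pre-deployment/licensemanager/. pre-deployment/licensemanager/
cp -r $SRC/pre-deployment/reportingConnector/dbScripts/dbcreation/. pre-deployment/reportingConnector/dbScripts/dbcreation/
cp -r $SRC/pre-deployment/reportingConnector/dbScripts/dbupdate/. pre-deployment/reportingConnector/dbScripts/dbupdate/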