Become Google Certified with updated Professional-Cloud-Developer exam questions and correct answers
A CI/CD pipeline is being developed that includes a version control system, Cloud Build, and Container Registry. Whenever a new tag is pushed to the repository, a Cloud Build job is triggered. This job runs unit tests on the latest code, builds a new Docker container image, and pushes it to Container Registry. The final stage of the pipeline deploys the updated container to the production Google Kubernetes Engine (GKE) cluster.
Which tool and deployment strategy should you choose to satisfy the following conditions (a sketch of one candidate strategy follows the list)?
• No downtime should be experienced
• Automated testing should be comprehensive
• Testing before release should be possible
• Quick rollbacks should be supported
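For context, one strategy that can meet all four conditions is a canary release: the new image serves a small slice of production traffic first, so it is tested before full release, and rolling back only means re-pointing the canary. The sketch below shows what that step could look like against GKE using the official kubernetes Python client; the namespace, Deployment, and image names are hypothetical.

```python
# A minimal sketch of a canary step on GKE using the kubernetes Python
# client. All resource names and the image tag are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # assumes kubectl already points at the cluster
apps = client.AppsV1Api()

NAMESPACE = "production"                    # hypothetical namespace
CANARY = "myapp-canary"                     # hypothetical canary Deployment
NEW_IMAGE = "gcr.io/my-project/myapp:v1.2"  # hypothetical tag from Cloud Build

# Point a small canary Deployment (one replica sharing the stable
# Deployment's Service selector) at the new image so it receives a
# slice of live traffic before the full rollout.
patch = {
    "spec": {
        "replicas": 1,
        "template": {"spec": {"containers": [
            {"name": "myapp", "image": NEW_IMAGE},
        ]}},
    }
}
apps.patch_namespaced_deployment(name=CANARY, namespace=NAMESPACE, body=patch)

# Rolling back is the same patch with the previous image, which keeps
# recovery fast if the canary misbehaves.
```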
Your company is planning to migrate its on-premises Hadoop environment to the cloud. The rising cost of storing and maintaining data in HDFS is a major concern, and you also want to make minimal changes to the existing data analytics jobs and architecture. How should you proceed with the migration?
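A common low-friction path, sketched below under the assumption that the jobs move to Dataproc and the HDFS data moves to Cloud Storage (for example with hadoop distcp), is to leave the jobs themselves intact and only swap hdfs:// URIs for gs:// ones; Dataproc ships with the Cloud Storage connector preinstalled. The bucket and paths are hypothetical.

```python
# A minimal PySpark sketch: the existing analytics job is unchanged
# except for the input URI, which now points at Cloud Storage instead
# of HDFS. Bucket and path names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-report").getOrCreate()

# Before the migration the job read from HDFS:
#   df = spark.read.parquet("hdfs:///warehouse/sales/")
# After copying the data to a bucket, only the URI changes:
df = spark.read.parquet("gs://my-company-datalake/warehouse/sales/")

df.groupBy("region").count().show()
```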
Working alongside the CI/CD team, you, as a developer, are tasked with identifying and resolving a problem with a new feature your team recently introduced. The CI/CD team used HashiCorp Packer to build a fresh Compute Engine image from your development branch. The image was created successfully, but instances created from it fail to start, and you have been asked to investigate why.
What should you do?
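Since the image builds but does not boot, a natural first step is to start a throwaway instance from it and read the serial port (console) output, where kernel and startup errors surface. Below is a minimal sketch using the Compute Engine API via google-api-python-client; the project, zone, and instance names are hypothetical.

```python
# A minimal sketch of fetching an instance's serial console log, often
# the quickest way to see why an image fails to start. The project,
# zone, and instance names are hypothetical.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")  # uses Application Default Credentials
resp = compute.instances().getSerialPortOutput(
    project="my-project",
    zone="us-central1-a",
    instance="packer-image-test",
).execute()
print(resp["contents"])  # boot messages usually reveal the failure point
```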
Azgomi is a community application designed to facilitate communication between people in close proximity. It is used for event planning, organizing sporting events, and for businesses to connect with their local communities. Azgomi launched recently in a few neighborhoods in Dallas and is rapidly growing into a global phenomenon. Its unique style of hyper-local community communication and business outreach is in demand around the world.
The company's executive statement is to take their local community services global. Azgomi wants to expand their existing service, with updated functionality, in new regions to better serve their global customers. They want to hire and train a new team to support these regions in their time zones. They will need to ensure that the application scales smoothly and provides clear uptime data.
Azgomi's environment is a mix of on-premises hardware and infrastructure running in Google Cloud Platform. The Azgomi team understands their application well but has limited experience with global-scale applications. Existing APIs run on Compute Engine virtual machine instances hosted in GCP. State is stored in a single-instance MySQL database in GCP. Data is exported to an on-premises Teradata/Vertica data warehouse, and data analytics is performed in an on-premises Hadoop environment. The application has no logging. There are only basic indicators of uptime, and alerts fire frequently when the APIs are unresponsive.
Azgomi's investors want to expand their footprint and support the increase in demand they are seeing. Their requirements are to:
• Expand availability of the application to new regions
• Increase the number of concurrent users that can be supported
• Ensure a consistent experience for users when they travel to different regions
• Obtain user activity metrics to better understand how to monetize their product
• Ensure compliance with regulations in the new regions (for example, GDPR)
• Reduce infrastructure management time and cost
• Adopt the Google-recommended practices for cloud computing
The technical requirements are to:
• Provide usage metrics and monitoring
• Require strong authentication and authorization for APIs
• Increase logging
• Store data in a cloud analytics platform
• Move the application and backend to a serverless architecture to facilitate elastic scaling
• Provide authorized access to internal apps in a secure manner
Azgomi has connected their Hadoop infrastructure to GCP using Cloud Interconnect in order to query data stored on persistent disks.
What would be the most appropriate IP strategy for them to implement?
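For illustration only, since the question is open-ended: Google's usual recommendation for a consistent multi-region experience is a single global anycast IP in front of a global external HTTP(S) load balancer, so users in every region reach the application through the same address. A hedged sketch of reserving such an address is below; the project and address names are hypothetical.

```python
# A minimal sketch, assuming the strategy chosen is a single global
# anycast IP on a global external HTTP(S) load balancer. The project
# and address names are hypothetical.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")
compute.globalAddresses().insert(
    project="azgomi-prod",
    body={"name": "azgomi-frontend-ip", "ipVersion": "IPV4"},
).execute()
# The reserved address would then back the load balancer's global
# forwarding rule so users in every region hit the same IP.
```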