Your company has an application that is running on multiple instances of Compute Engine. It generates 1 TB per day of logs. For compliance reasons, the logs need to be kept for at least two years. The logs need to be available for active query for 30 days. After that, they just need to be retained for audit purposes. You want to implement a storage solution that is compliant, minimizes costs, and follows Google-recommended practices. What should you do?
On Google Cloud, the recommended practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on the instances and send the logs to Cloud Logging.
From there, a Cloud Logging sink can route the aggregated logs to Cloud Storage.
Cloud Storage is the right export destination because the requirements call for lifecycle management based on the age of the data.
In this case, the logs must be available for active queries for 30 days after they are written; after that, they only need to be retained for audit purposes.
For the active-query window, BigQuery can query the data directly in Cloud Storage as an external data source, and moving objects older than 30 days to Coldline makes the solution cost-optimal.
Therefore, the correct answer is as follows:
1. Install the Cloud Logging agent on all instances.
2. Create a sink that exports the logs to a regional Cloud Storage bucket.
3. Create an Object Lifecycle rule to transition the files to the Coldline storage class after one month.
4. Set up a bucket-level retention policy using Bucket Lock.
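As a rough illustration of steps 2-4, here is a minimal sketch using the google-cloud-logging and google-cloud-storage Python clients; the project, sink, bucket names, and log filter are placeholder assumptions, and the sink's service account still needs write access to the bucket.

```python
from google.cloud import logging_v2, storage

PROJECT = "my-project"          # hypothetical project ID
BUCKET = "my-compliance-logs"   # hypothetical, pre-created regional bucket

# Step 2: route Compute Engine logs to the Cloud Storage bucket via a sink.
log_client = logging_v2.Client(project=PROJECT)
sink = log_client.sink(
    "compliance-log-sink",
    filter_='resource.type="gce_instance"',
    destination=f"storage.googleapis.com/{BUCKET}",
)
sink.create()

# Step 3: transition objects to the Coldline storage class after 30 days.
storage_client = storage.Client(project=PROJECT)
bucket = storage_client.get_bucket(BUCKET)
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=30)

# Step 4: enforce a two-year retention period, then lock it (locking is
# irreversible, which is what makes the bucket suitable for compliance).
bucket.retention_period = 2 * 365 * 24 * 60 * 60  # seconds
bucket.patch()
bucket.lock_retention_policy()
```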
Your company has an enterprise application running on Compute Engine that requires high availability and high performance. The application has been deployed on two instances in two zones in the same region in active-passive mode. The application writes data to a persistent disk; in the case of a single-zone outage, that data should be immediately made available to the other instance in the other zone. You want to maximize performance while minimizing downtime and data loss. What should you do?
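This scenario typically points at a regional persistent disk, which synchronously replicates writes across two zones in the same region and can be force-attached to the standby instance during a zonal outage. A hedged sketch with the google-cloud-compute client; all names are placeholders:

```python
from google.cloud import compute_v1

# Create a regional SSD persistent disk replicated across two zones.
disk = compute_v1.Disk(
    name="app-data-disk",
    size_gb=500,
    type_="projects/my-project/regions/us-central1/diskTypes/pd-ssd",
    replica_zones=[
        "projects/my-project/zones/us-central1-a",
        "projects/my-project/zones/us-central1-b",
    ],
)
client = compute_v1.RegionDisksClient()
client.insert(
    project="my-project", region="us-central1", disk_resource=disk
).result()  # block until the disk is created
```

On a zone failure, the disk can be force-attached to the passive instance in the surviving zone, so data written before the outage remains available.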
For this question, refer to the TerramEarth case study. You are building a microservice-based application for TerramEarth. The application is based on Docker containers. You want to follow Google-recommended practices to build the application continuously and store the build artifacts. What should you do?
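The Google-recommended pieces here are Cloud Build for continuously building the Docker containers and a registry (Container Registry, or Artifact Registry today) for storing the build artifacts. A hedged sketch that creates a build trigger with the google-cloud-build client; the repo, branch, and project names are assumptions, and the actual build steps would live in the repo's cloudbuild.yaml:

```python
from google.cloud.devtools import cloudbuild_v1

client = cloudbuild_v1.CloudBuildClient()
trigger = cloudbuild_v1.BuildTrigger(
    name="terramearth-ci",
    description="Build and push microservice images on every push",
    trigger_template=cloudbuild_v1.RepoSource(
        project_id="my-project",
        repo_name="terramearth-app",
        branch_name="main",
    ),
    filename="cloudbuild.yaml",  # build/push steps defined in the repo
)
client.create_build_trigger(project_id="my-project", trigger=trigger)
```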
Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?
https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
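One practical way to surface the most expensive queries and the heaviest users, in the spirit of the linked post, is to query BigQuery's INFORMATION_SCHEMA jobs view. A minimal sketch, where the region, project, and the on-demand rate used for the cost estimate are assumptions:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
sql = """
SELECT
  user_email,
  query,
  total_bytes_billed,
  -- assumes on-demand pricing per TiB billed; adjust the rate as needed
  total_bytes_billed / POW(2, 40) * 6.25 AS approx_cost_usd
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY total_bytes_billed DESC
LIMIT 10
"""
for row in client.query(sql).result():
    print(row.user_email, round(row.approx_cost_usd, 2), row.query[:80])
```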
You are configuring the cloud network architecture for a newly created project in Google Cloud that will host applications on Compute Engine. Compute Engine virtual machine instances will be created in two different subnets (sub-a and sub-b) within a single region:
* Instances in sub-a will have public IP addresses.
* Instances in sub-b will have only private IP addresses.
To download updated packages, instances must connect to a public repository outside the boundaries of Google Cloud. You need to allow sub-b to access the external repository. What should you do?
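The requirement (private-only instances in sub-b reaching an external repository) points at Cloud NAT. A hedged sketch that attaches a NAT config scoped to sub-b to a Cloud Router, using the google-cloud-compute client; network, region, and project names are placeholders:

```python
from google.cloud import compute_v1

# NAT only the private subnet, with automatically allocated external IPs.
nat = compute_v1.RouterNat(
    name="nat-for-sub-b",
    nat_ip_allocate_option="AUTO_ONLY",
    source_subnetwork_ip_ranges_to_nat="LIST_OF_SUBNETWORKS",
    subnetworks=[
        compute_v1.RouterNatSubnetworkToNat(
            name="projects/my-project/regions/us-central1/subnetworks/sub-b",
            source_ip_ranges_to_nat=["ALL_IP_RANGES"],
        )
    ],
)
router = compute_v1.Router(
    name="nat-router",
    network="projects/my-project/global/networks/my-vpc",
    nats=[nat],
)
client = compute_v1.RoutersClient()
client.insert(
    project="my-project", region="us-central1", router_resource=router
).result()
```

Instances in sub-b can then reach the public repository for package updates without ever receiving public IP addresses.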