
Databricks Certified Data Engineer Associate Exam - Topic 4 Question 37 Discussion

Actual exam question from the Databricks Certified Data Engineer Associate exam
Question #: 37
Topic #: 4

A data engineer and a data analyst are working together on a data pipeline. The data engineer is working on the raw, bronze, and silver layers of the pipeline using Python, and the data analyst is working on the gold layer of the pipeline using SQL. The raw source of the pipeline is a streaming input. They now want to migrate their pipeline to use Delta Live Tables.

Which change will need to be made to the pipeline when migrating to Delta Live Tables?

Suggested Answer: A

When migrating this pipeline to Delta Live Tables (DLT), no change of language is required. DLT supports multi-language pipelines: the same pipeline can include Python notebooks for the data engineering layers (raw, bronze, and silver) and SQL notebooks for the analytics layer (gold), so each contributor keeps working in their preferred language. DLT also reads streaming sources directly, so the streaming input does not need to be converted to a batch source. This capability is particularly valuable in collaborative settings, since it leverages the strengths of each language for different stages of data processing.

Reference: Databricks documentation on Delta Live Tables: Delta Live Tables Guide
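As an illustrative sketch of the multi-language setup the explanation describes (the table names and the storage path are hypothetical, and the code assumes the Databricks DLT runtime, where the `dlt` module and the `spark` session are provided automatically), the engineer's Python notebook and the analyst's SQL notebook could both belong to the same pipeline:

```python
# Python notebook: bronze and silver layers.
# Requires the Databricks Delta Live Tables runtime; `dlt` is not a
# standalone PyPI package, so this only runs inside a DLT pipeline.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw events ingested from the streaming source")
def events_bronze():
    # Auto Loader ingests the streaming input incrementally.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/events")  # hypothetical source path
    )

@dlt.table(comment="Validated events")
def events_silver():
    # Read the bronze table as a stream and drop malformed rows.
    return dlt.read_stream("events_bronze").where(col("event_id").isNotNull())
```

```sql
-- SQL notebook in the same DLT pipeline: gold layer.
CREATE OR REFRESH LIVE TABLE events_gold
  COMMENT "Daily event counts for analysts"
AS SELECT date(event_time) AS event_date, count(*) AS events
   FROM LIVE.events_silver
   GROUP BY date(event_time);
```

Both notebooks are simply attached as sources of one DLT pipeline; DLT resolves the `LIVE.` references across languages, which is why answer A (keep the mixed Python/SQL notebook sources) is correct.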


Contribute your Thoughts:

Dahlia
3 months ago
I heard they can still use streaming sources, so D seems wrong too.
Gianna
3 months ago
No way it needs to be entirely in Python! That’s not how Delta Live Tables work.
Rickie
3 months ago
Wait, are they really saying it has to be all SQL? That sounds off.
Francisca
4 months ago
Definitely agree with that! Mixing languages is a plus.
Tanesha
4 months ago
I think the pipeline can have different notebook sources in SQL & Python.
Glenna
4 months ago
I thought Delta Live Tables could work with both batch and streaming sources, so I’m leaning towards A as well.
Felix
4 months ago
I practiced a similar question, and I think the streaming input is still valid with Delta Live Tables, so option D doesn't seem right.
Jolanda
4 months ago
I'm not entirely sure, but I feel like the pipeline doesn't have to be written entirely in one language. That seems too restrictive.
Ashton
5 months ago
I remember we discussed how Delta Live Tables can support both SQL and Python, so I think option A might be correct.
Aileen
5 months ago
I've worked with Delta Live Tables before, and I know it supports both SQL and Python. My guess is that option A is the correct answer, since the pipeline can continue to use the different notebook sources.
Meghan
5 months ago
Okay, let's see. The pipeline is currently using a streaming input, and Delta Live Tables is being introduced. I'm guessing the change needed has something to do with that transition from streaming to batch.
Quentin
5 months ago
I'm a bit confused on this one. The question mentions using both SQL and Python, but the answer choices seem to imply we need to choose one or the other. I'll need to think this through carefully.
Bettina
5 months ago
Hmm, this seems straightforward. I think the key is understanding how Delta Live Tables works and how it differs from the current pipeline setup.
Dudley
1 year ago
Option B, easy peasy. If it's good enough for the data engineer, it's good enough for the data analyst. Time to brush up on that SQL, my friend. Oh, and don't forget the backup sunglasses for when the code inevitably blinds you with its brilliance.
Cordelia
1 year ago
Option C, no doubt! Gotta keep that Python love alive, am I right? I mean, who needs SQL when you can just write the whole pipeline in Python? It's like the '90s all over again!
Jolanda
1 year ago
We can still use SQL for that layer, no problem.
Bronwyn
1 year ago
But what about the SQL part for the gold layer?
Willie
1 year ago
Agreed, Python is so much more flexible and powerful.
Yasuko
1 year ago
I think we should stick with Python for the pipeline.
King
1 year ago
Whoa, this is a tricky one! I'm going to have to go with option A. Mixing and matching notebook sources in SQL and Python sounds like a recipe for a programmer's paradise. Or a nightmare, depending on your perspective.
Merissa
1 year ago
Hmm, I think option D is the way to go. Using a batch source instead of a streaming one for Delta Live Tables seems like the logical choice. After all, who needs real-time when you can have delayed data, am I right?
Lavonna
1 year ago
Data engineer: Let's make the change and see how it improves the pipeline.
Adelina
1 year ago
Data analyst: Yeah, it might be more efficient that way.
Vonda
1 year ago
Data engineer: I agree, switching to a batch source for Delta Live Tables makes sense.
Kiley
1 year ago
Actually, I think the pipeline can still have different notebook sources in SQL & Python even with Delta Live Tables.
Louvenia
1 year ago
I bet the data analyst is secretly hoping for option B, just so they can lord their SQL skills over the data engineer. Classic power move!
Marvel
1 year ago
B: I wonder if we'll have to switch to writing the pipeline entirely in SQL.
Lashaun
1 year ago
A: Yeah, that sounds right. It'll be a big change for us.
Temeka
1 year ago
B: I think we'll need to use a batch source instead of a streaming source.
Mee
1 year ago
A: We need to migrate our pipeline to Delta Live Tables.
Brande
1 year ago
I agree. I believe the pipeline will need to be written entirely in SQL for Delta Live Tables.
Gerardo
1 year ago
Wait, what? Option B wants me to write the entire pipeline in SQL? That's a hard pass. I'm sticking with Python, thanks.
Kiley
1 year ago
I think we will need to make some changes to our pipeline when migrating to Delta Live Tables.
Jenise
1 year ago
Option B? Really? Forcing the whole pipeline to be in SQL seems like a bit of a stretch. This is a tough one.
Azalee
1 year ago
Interesting dilemma. I wonder if they could keep the SQL and Python separation, but just wrap it all in Delta Live Tables. Guess we'll have to see what the experts say!
Harley
1 year ago
Maybe we can find a way to integrate both Python and SQL into Delta Live Tables.
Yen
1 year ago
But what about the SQL part of the pipeline? We can't just abandon that.
Erick
1 year ago
I think we might need to rewrite the pipeline entirely in Python.
Elvis
1 year ago
I'm not sure about that. The question says the data engineer is working with Python, so I think option C might be the right choice.
Francesco
1 year ago
Let's make the necessary changes to migrate to Delta Live Tables.
Jesus
1 year ago
That makes sense since the data engineer is already working with Python.
Kimberely
1 year ago
I agree. I believe the pipeline will need to be written entirely in Python.
Nada
1 year ago
I think we need to switch to using Delta Live Tables for our pipeline.
Hortencia
1 year ago
Hmm, I think option D is the way to go. Migrating to Delta Live Tables likely requires using a batch source instead of a streaming one.
Chau
1 year ago
Yeah, it's important to make sure the pipeline is compatible with Delta Live Tables.
Eura
1 year ago
I agree, using a batch source would make the migration smoother.
