This hard-level Python/Coding question has been reported in data engineering interviews at companies like Goldman Sachs. Though asked less often than easier questions, it tests the deeper understanding that distinguishes strong candidates. Mastering the underlying concepts (Python, SQL) will help you answer variations of this question confidently.
This is a senior-level question that tests architectural thinking. Lead with the high-level design, then drill into specifics. Discuss trade-offs explicitly; there is rarely one correct answer. Show awareness of scale, fault tolerance, and operational complexity.
**Why Streaming:** Loading a 10GB JSON file with a single `json.load` call exhausts memory. Prefer JSON Lines (ndjson), one JSON object per line, and iterate the file line by line. For a single large, deeply nested document, use ijson to parse it as a stream (see the sketch below).
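A minimal ijson sketch, assuming a hypothetical `events.json` that contains one large top-level JSON array; the `status` field and the filter are illustrative, not from the original question:

```python
import ijson

# Minimal sketch: stream a large nested JSON document without loading it all.
# Assumes a hypothetical events.json holding one top-level array of objects.
active_count = 0
with open("events.json", "rb") as f:
    # ijson.items() yields each array element lazily; the whole document
    # is never materialized in memory at once.
    for record in ijson.items(f, "item"):
        if record.get("status") == "active":  # illustrative filter
            active_count += 1

print(active_count)
```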
**Pattern:** Open the file and loop `for line in f`, parsing each line with `json.loads(line)`; append records that match the filter to a batch. Insert with `executemany` every 1000 rows and commit per batch, as sketched below.
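A minimal sketch of that pattern, using SQLite from the standard library as the target database; the file name `events.ndjson`, the table schema, and the filter condition are assumptions for illustration:

```python
import json
import sqlite3

BATCH_SIZE = 1000

# Assumed target table; swap in your real database connection and schema.
conn = sqlite3.connect("events.db")
conn.execute("CREATE TABLE IF NOT EXISTS events (user_id TEXT, amount REAL)")

batch = []
with open("events.ndjson", encoding="utf-8") as f:
    for line in f:
        obj = json.loads(line)          # one JSON object per line (ndjson)
        if obj.get("amount", 0) > 0:    # illustrative filter condition
            batch.append((obj["user_id"], obj["amount"]))
        if len(batch) >= BATCH_SIZE:
            # executemany sends the whole batch in one call; committing per
            # batch bounds both memory use and transaction size.
            conn.executemany("INSERT INTO events VALUES (?, ?)", batch)
            conn.commit()
            batch.clear()

if batch:  # flush the final partial batch
    conn.executemany("INSERT INTO events VALUES (?, ?)", batch)
    conn.commit()
conn.close()
```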