Write a Python script to parse a large JSON file, filter records based on a condition, and write the result to a database.

Python/Coding · hard · 0.4 min read · Premium

Frequency: Low (asked at 1 company)
Category: 179 questions in Python/Coding
Difficulty split: 127 easy | 24 medium | 28 hard in this category
Total bank: 1,863 questions across 7 categories
Asked at these companies: Goldman Sachs
Key concepts tested: python, sql

Why This Question Matters

This hard-level Python/Coding question has been reported in data engineering interviews at companies like Goldman Sachs. While less common than easier questions, it tests the deeper understanding that distinguishes strong candidates. Mastering the underlying concepts (python, sql) will help you answer variations of this question confidently.

How to Approach This

This is a senior-level question that tests architectural thinking. Lead with the high-level design, then drill into specifics. Discuss trade-offs explicitly; there is rarely one correct answer. Show awareness of scale, fault tolerance, and operational complexity.

Expert Answer

Why Streaming: Loading a 10 GB JSON file into memory with json.load() kills the process. Prefer JSON Lines (ndjson), one JSON object per line, so you can iterate line by line. For nested, single-document JSON, use ijson to parse as a stream.
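
For illustration, a minimal ijson sketch, assuming the file is one large top-level JSON array; the file name, the status filter, and process() are hypothetical placeholders, while "item" is ijson's standard prefix for elements of a root-level array:

import ijson

# Stream elements of a root-level JSON array without loading the whole file.
with open("records.json", "rb") as f:
    for obj in ijson.items(f, "item"):
        if obj.get("status") == "active":  # hypothetical filter condition
            process(obj)                   # hypothetical downstream handler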

Pattern: Open the file and, for each line, obj = json.loads(line); if the condition holds, append obj to a batch. Insert with executemany() every 1,000 rows and commit per batch.

Production: Use connection pooling and progress logging. Handle malformed lines with try/except and route them to a dead-letter file. At Goldman Sachs: schema validation before insert.

batch = []
for line in open(path):
    obj = json.loads(line)
    if cond(obj):
        batch.append(obj)
    if len(batch) >= 1000:                 # insert in batches of 1,000
        cursor.executemany(insert_sql, batch)
        batch.clear()
if batch:
    cursor.executemany(insert_sql, batch)  # flush the final partial batch
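
Tying the pieces together, a self-contained sketch of the production points above, using sqlite3 from the standard library as a stand-in database since the answer names no specific driver; the file names, table schema, and filter condition are illustrative:

import json
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)

conn = sqlite3.connect("records.db")  # stand-in; a pooled connection in production
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS events (id TEXT, amount REAL)")

insert_sql = "INSERT INTO events (id, amount) VALUES (?, ?)"
batch, inserted = [], 0

with open("input.jsonl") as f, open("dead_letter.jsonl", "w") as dlq:
    for line in f:
        try:
            obj = json.loads(line)
        except json.JSONDecodeError:
            dlq.write(line)                 # dead-letter malformed lines
            continue
        # schema/type validation of obj would slot in here, before queuing
        if obj.get("amount", 0) > 100:      # illustrative filter condition
            batch.append((obj.get("id"), obj.get("amount")))
        if len(batch) >= 1000:
            cur.executemany(insert_sql, batch)
            conn.commit()                   # commit per batch
            inserted += len(batch)
            batch.clear()
            logging.info("inserted %d rows so far", inserted)
    if batch:                               # flush the final partial batch
        cur.executemany(insert_sql, batch)
        conn.commit()

conn.close()

Committing per batch bounds memory and limits how much work is lost on a crash; with a real driver you would borrow the connection from a pool rather than opening it inline.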

Related Python/Coding Questions

  • easy: What are traits in Scala, and how are they different from classes?
  • medium: Write a Python function to check if a string is a palindrome.
  • easy: What is the difference between a list and a tuple in Python?
  • easy: Explain the difference between shallow copy and deep copy in Python.
  • easy: Write a Python function to find the first non-repeating character in a string.
