Available for new missions

Camille Lebrun

Data Consultant & Analytics Engineer | Data Engineering & Analytics Engineering Missions | Available in Toulouse and Remote | Expert Modern Data Stack


3+

Years crafting modern data stacks

+8h

Weekly reporting time saved per project

5+

Custom data stacks deployed end-to-end

About

I am Camille Lebrun, a Data Consultant and Analytics Engineer based in Toulouse, France. With a dual scientific and data background (an MSc in Biochemistry and a Master's in Data & AI), I have 3+ years of experience as an Analytics Engineer and Data Consultant. Specializing in Data Engineering and Analytics Engineering missions, I work with clients remotely across France and internationally.

I combine technical versatility across the Modern Data Stack (Python, SQL, dbt, BigQuery, Airflow, Kestra, data visualization) with a genuine passion for the craft to understand my clients' needs and challenges, and to support them effectively in getting value from their data.

Curious and adaptable, I excel at selecting the right data tools and architecture while remaining a hands-on contributor. I focus on translating complex business needs into robust technical implementations, working autonomously within a team.

Functional Domains

Retail · Marketing · FMCG · E-commerce · E-retail Media · Travel Industry · Fashion & Apparel

Skills

Modern Data Stack Expertise

dbt
BigQuery
Airflow
Kestra
Python
Airbyte
dlt
Looker Studio (formerly Data Studio)

FAQ

Frequently Asked Questions

How do you orchestrate a data platform ready for business teams?

I orchestrate data platforms through a source audit, medallion modeling, dbt and BigQuery pipelines, then fast delivery to Looker Studio (formerly Data Studio), Power BI, or Streamlit for day-to-day consumption.

How do you guarantee the reliability of pipelines and dashboards?

I ensure reliability through automated dbt tests, Slack/Cloud Monitoring alerts, anomaly detection via custom SQL, and volume/freshness guards on every critical table.
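The volume and freshness guards mentioned above can be sketched as a small Python check. This is a minimal illustration, not the actual monitoring code: the function name, thresholds, and alert wording are assumptions, and in a real deployment the inputs would come from BigQuery table metadata and the alerts would be routed to Slack.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical defaults -- real values would come from per-table config.
MAX_STALENESS = timedelta(hours=24)
MIN_ROW_RATIO = 0.5  # alert if volume drops below 50% of the recent average

def check_table_health(last_loaded_at, row_count, avg_row_count,
                       now=None, max_staleness=MAX_STALENESS,
                       min_row_ratio=MIN_ROW_RATIO):
    """Return a list of alert messages for a table's freshness and volume."""
    now = now or datetime.now(timezone.utc)
    alerts = []
    # Freshness guard: the table must have been loaded recently enough.
    if now - last_loaded_at > max_staleness:
        alerts.append(f"stale: last load at {last_loaded_at.isoformat()}")
    # Volume guard: today's row count should not collapse vs. the recent average.
    if avg_row_count and row_count < min_row_ratio * avg_row_count:
        alerts.append(f"volume drop: {row_count} rows vs ~{avg_row_count} average")
    return alerts
```

A healthy table (loaded two hours ago, row count near its average) yields an empty alert list; a table that is both stale and short on rows yields two alerts, which the orchestrator can forward to the alerting channel.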

Can you optimize an existing stack without starting from scratch?

Yes, I launch with a technical audit, identify quick wins (dbt refactoring, partitioning, cost controls) and document an actionable roadmap for Modern Data Stack optimization.

Which KPIs prove the impact of your data missions?

I measure impact through incident detection time (-90%), hours saved in reporting (+8h/week), data reliability (green tests) and dashboard adoption metrics.

How do you collaborate with product, marketing, or ops teams?

I collaborate through intent workshops, accessible documentation, governance rituals, and ongoing support to keep metrics and alerts transparent for all stakeholders.

Which tools do you prioritize for orchestration, governance, and reliability?

I prioritize Kestra/Airflow for orchestration, dbt for modeling, Airbyte/dlt for ingestion, Power BI or Looker Studio (formerly Data Studio) for KPIs, plus Python scripts for validation and monitoring in Modern Data Stack implementations.