Our Mission
Revolutionizing OOH advertising with deep data analytics, our team transforms geolocation data into insights that help brands plan, execute, and measure campaigns. We are seeking a proactive, experienced Data Engineer to build the data foundation for the future of Ad-Tech.
The Role
We are looking for a Senior/Lead Data Engineer to architect and build the company's data backbone. You will own terabyte-scale data pipelines, maximizing automation, efficiency, and scalability, and collaborate with the data science and analytics teams to turn prototypes into robust production data products. This is a high-impact role in which you will make foundational technical decisions that shape the platform for years to come.
What You'll Do
Build and optimize PySpark/Databricks pipelines processing terabytes of real-time and batch geolocation data daily.
Automate and cost-optimize our AWS data infrastructure, keeping it scalable, reliable, and efficient.
Empower data science and analytics teams: productionize models and build data products for campaign analysis.
Ensure data quality and privacy: automated monitoring and rapid anonymization to keep datasets secure.
Develop core infrastructure: golden tables, reports, and dashboards that serve as a single source of truth.
Lead a foundational greenfield project: automate and re-architect the core data preparation platform.
Core Qualifications (Must-Haves):
Experience: 5+ years of professional experience in data engineering for a Senior role; 7+ years with demonstrated technical leadership for a Lead role.
Python & Spark: Advanced proficiency in Python for data engineering and extensive hands-on experience with PySpark for processing large-scale datasets.
Databricks: Practical, in-depth experience with the Databricks platform, with a strong understanding of how to optimize Spark jobs for performance and cost.
SQL Mastery: Ability to write, debug, and optimize complex SQL queries.
Cloud Expertise: Strong experience with AWS cloud services, particularly S3 for data storage, SQS/SNS for messaging, and Lambda for serverless functions.
Data Architecture: Solid understanding of Data Lake and Data Warehouse architectures and design principles.
Orchestration & CI/CD: Experience with workflow orchestration tools like Apache Airflow (or Prefect, Dagster) and building/maintaining CI/CD pipelines (e.g., GitHub Actions).
Containerization: Foundational knowledge of Docker and Kubernetes.
Communication: Professional fluency in English, as all team communication and documentation are in English.
Nice-to-Have:
Geospatial Data: Experience with geospatial libraries and tools (e.g., GeoPandas, PostGIS).
Infrastructure as Code: Experience with Terraform.
Databases: Familiarity with columnar databases (e.g., Redshift, ClickHouse).
Monitoring & Visualization: Experience with monitoring stacks (e.g., ELK, Grafana).
Version Control: Proficiency with Git and collaborative workflows like GitFlow.
Scam Warning:
Please be aware that the billups Global Talent Acquisition team will only contact you directly from an official @billups.com email address. We do not use third-party platforms or unofficial email domains—such as careers.info—to make job offers or request personal information.
billups will never:
Ask you to provide passport information, utility bills, or other sensitive personal documents during the interview process.
Request that you purchase work equipment or send money in any form.
Make job offers through unofficial channels or email addresses not ending in @billups.com.
To be considered for any open position at billups, you must apply directly through our official website at billups.com/careers. We do not accept applications via email, social media, or third-party job boards.
If you receive any suspicious communication claiming to be from billups—especially from non-@billups.com email addresses—do not respond or share any personal or financial information.
To verify the legitimacy of any message or job opportunity, please contact us directly at careers@billups.com or visit our official website.
Your safety and security are important to us — stay alert!