Data Engineer (Scala/PostgreSQL/Kafka/Spark)



As an introductory note on this particular application: our new architecture for this product has been the subject of multiple conference talks in the PostgreSQL community. It uses PostgreSQL as a query engine over Parquet files (via our own foreign data wrapper) generated by Spark jobs, allowing us to scale storage and the query engine independently. We give preference to candidates with strong PostgreSQL experience in this area as well as on the Scala/Spark side.
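To illustrate the pattern described above, here is a rough sketch of how a Parquet foreign data wrapper can expose Spark-generated files to PostgreSQL. The extension, server, table, column, and file names below are hypothetical, for illustration only, and are not our actual schema:

```sql
-- Hypothetical setup: a foreign data wrapper that lets PostgreSQL
-- scan Parquet files written by Spark jobs as an ordinary table.
CREATE EXTENSION parquet_fdw;
CREATE SERVER parquet_srv FOREIGN DATA WRAPPER parquet_fdw;

-- Column definitions must match the Parquet schema the Spark jobs write.
CREATE FOREIGN TABLE events (
    user_id    text,
    event_time timestamp,
    country    text
) SERVER parquet_srv
  OPTIONS (filename '/data/events/part-00000.parquet');

-- Queries run through the PostgreSQL planner and executor, while the
-- data itself lives in Parquet, so storage and query engine scale separately.
SELECT country, count(*) FROM events GROUP BY country;
```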

Data Engineer


Adjust is a fast-growing mobile marketing analytics company. We build business intelligence for mobile apps, placing a high premium on scientific statistics, hand-in-glove UX, and lean, pragmatic product iteration. We enable marketers to understand how their marketing campaigns are performing.

We are looking for a Data Engineer to join our Development Team in Berlin.

What we offer you:  

- A competitive salary

- Flexibility in work schedule

- Relocation assistance

- An international team with strong focus on transparency

- Regular team gatherings and company retreats

- An opportunity to do office exchanges in other satellite locations

- Additional perks such as Friday team lunches and free access to our company gym

More details about our company culture and perks can be found on our careers page.


Your role:

As a Data Engineer you will be responsible for ensuring rapid ingestion of data into our retargeting platform, AudienceBuilder. You will help us scale from tens of terabytes to multiple petabytes of data while keeping the code maintainable and performant. You will build and maintain very large queryable data sets using Spark, Parquet, and PostgreSQL, which includes ingesting and compacting data to save space and improve query performance. When a new feature is developed, you will translate business requirements into technical specifications and implement them. You will analyze performance bottlenecks and optimize accordingly.
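As a rough sketch of the compaction step mentioned above (assuming a Spark deployment; the paths and target file count are illustrative, and this is not our production code):

```scala
import org.apache.spark.sql.SparkSession

object CompactParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("compact-events")
      .getOrCreate()

    // Ingestion jobs tend to produce many small Parquet files.
    val events = spark.read.parquet("/data/events/2019-01-01/raw")

    // Rewriting them as fewer, larger files reduces per-file overhead
    // on the query side and improves scan performance.
    events
      .coalesce(16) // illustrative target file count
      .write
      .mode("overwrite")
      .parquet("/data/events/2019-01-01/compacted")

    spark.stop()
  }
}
```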


Your tasks:

- Review designs and ensure scalability to hundreds of thousands of ingested events per second

- Implement, test, and document components to ingest data to support new features

- Provide escalation support for the data ingestion portion of the platform

- Work closely with our operations team to develop maintenance and operational procedures as well as escalation paths


Your profile:

- Experience in development in a distributed environment

- Solid knowledge of Scala and/or Java

- Experience with Spark

- Flink and PostgreSQL experience is a plus


Interested? Let’s Talk!

Application link: 

Best Regards,
Chris Travers
Head of Database

Tel: +49 162 9037 210 | Skype: einhverfr
Saarbrücker Straße 37a, 10405 Berlin

