San Francisco, California - USD - Full Time - Posted: Thursday, 11 October 2018

Job Responsibilities:
  • Data ingestion pipeline: Build our next-generation streaming ingestion pipeline for scale (10x data) and speed (
  • Self-service transformation engine: Build and maintain our self-service tooling that allows anybody at Coinbase to transform complex JSON and create dimensional models. Specific challenges are supporting type 2 slowly changing dimensions, end-to-end testability, validation/monitoring/alerting and efficient execution. Today we do this with Apache Airflow.
  • Anomaly detection: Build a comprehensive anomaly detection service that allows anybody at Coinbase to quickly set up notifications in order to detect process breakage.
  • Security: Build a security layer that authorizes data access at the row/column level, plus a logging and auditing system to surface suspicious data access patterns.
Requirements:
  • Exhibit our core cultural values: positive energy, clear communication, efficient execution, continuous learning
  • Experience building back-end data systems at scale with parallel/distributed compute
  • Experience building microservices
  • Experience with Python and/or Java/Scala
  • Knowledge of SQL
  • A data-oriented mindset
Preferred (Not Required):
  • Computer Science or related engineering degree
  • Deep knowledge of Apache Airflow, Spark, Hadoop, Hive, Kafka/Kinesis
What to Send:
A resume that describes scalable systems you've built

FocusKPI

We strongly recommend that you never provide your bank account details to an advertiser during the job application process. Should you receive a request of this nature, please contact support, giving the advertiser's name and job reference.
