Our client’s platform harnesses blockchain technology to empower users, and the company has received investment from a number of prominent Silicon Valley figures.
We are looking for a full-time data engineer. You will be at the core of making people’s data work for them. You will design and maintain the ETL pipeline: pulling and parsing data from various APIs and downloaded data stores, populating normalized relational databases, and calculating cached views (usually in a NoSQL form) to power our various data products and services.
Our current stack includes Airflow, Python, JavaScript/Node.js, PostgreSQL, MongoDB, and AWS hosting.
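For a sense of the day-to-day work, here is a toy sketch of the extract → normalize → cached-view flow described above, in plain Python. All names and data are hypothetical; in production these steps would run as Airflow tasks reading from real APIs and writing to PostgreSQL and MongoDB rather than in-memory structures.

```python
import json
from collections import defaultdict

def extract(raw_json: str) -> list[dict]:
    """Parse raw API output (here, a JSON string) into records."""
    return json.loads(raw_json)

def transform(records: list[dict]) -> dict[str, list[tuple]]:
    """Normalize flat records into separate 'tables' (users, events),
    mirroring the normalized-RDB step of the pipeline."""
    users = {}
    events = []
    for rec in records:
        users[rec["user_id"]] = (rec["user_id"], rec["user_name"])
        events.append((rec["user_id"], rec["event"], rec["value"]))
    return {"users": sorted(users.values()), "events": events}

def load_cached_view(tables: dict[str, list[tuple]]) -> dict[str, int]:
    """Compute a cached aggregate view (total value per user), the kind of
    document that might be stored in a NoSQL store to power a data product."""
    totals: defaultdict[str, int] = defaultdict(int)
    for user_id, _event, value in tables["events"]:
        totals[user_id] += value
    return dict(totals)

# Hypothetical API payload, used here in place of a live request.
raw = json.dumps([
    {"user_id": "u1", "user_name": "Ada", "event": "click", "value": 2},
    {"user_id": "u1", "user_name": "Ada", "event": "view", "value": 3},
    {"user_id": "u2", "user_name": "Grace", "event": "click", "value": 5},
])
view = load_cached_view(transform(extract(raw)))
```

Here `view` ends up as `{"u1": 5, "u2": 5}`, the per-user aggregate a downstream product would read from the cache.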
What We Are Looking For:
- Expertise in building out data pipelines, efficient ETL design, implementation, and maintenance
- Mastery of relational databases and the ability to derive normalized schemas from datasets
- Experience with NoSQL databases
- Experience building and maintaining a data warehouse in production environments
Nice-to-Have Qualifications:
- Experience with Apache Airflow, AWS tools, Git, and Linux
- Experience with systems for transforming large datasets such as Spark or Hadoop
- Familiarity with Python-based data science tools (e.g., pandas)
What We Offer:
- Highly competitive wages
- Top-of-the-line equipment: laptop of choice, custom monitor setup, optional standing desk, etc.