Edge & Node stands at the vanguard of web3: a vision of a world powered by individual autonomy, shared self-sovereignty, and limitless collaboration. Established by the trailblazers behind The Graph, we're on a mission to make The Graph the internet's unbreakable foundation of open data. Edge & Node invented and standardized subgraphs across the industry, solidifying The Graph as the definitive way to organize and access blockchain data. Drawing on deep expertise in developing open-source software, tooling, and protocols, we empower builders and entrepreneurs to bring unstoppable applications to life with revolutionary digital infrastructure.

Edge & Node acts on a set of unwavering principles that guide our journey in shaping the future. We champion a decentralized internet, free from concentrated power, where collective consensus determines what is accepted as truth rather than authoritative dictation. Our commitment to censorship resistance reinforces our vision of an information age free from the grasp of any single entity. By building open-source software, we challenge the stagnant landscape of web2, recognizing that true innovation thrives on transparency and collaboration. We imagine a permissionless future where the shackles imposed by central gatekeepers are not only removed but relegated to a bygone era. And at the foundation of it all, our trust shifts from middlemen to trustless systems, leveraging smart contracts to eliminate the age-old vulnerabilities of misplaced trust.

The Data Science team works closely with teams across Edge & Node to deliver high-quality data for product research and development, go-to-market, and business analytics. We work across the data lifecycle, from infrastructure to data analytics.

We are looking for an early-career Data Engineer focused on developing and maintaining data science pipelines.
Ideally, the team would like to bring on someone with experience in the tools currently in use, which include but are not limited to Redpanda, Materialize, and GCP. In this role, you will monitor and maintain the reliability of the Redpanda cluster, streaming database, DBT jobs, QoS oracle, and other data engineering systems. You'll be expected to learn Materialize and help migrate BigQuery models to reduce costs. In addition, you will help establish and maintain good standards around documentation and internal educational tools, and respond to data engineering/devops requests in our incident management process.

What You'll Be Doing

* Learning our infrastructure and data engineering toolset

* Partnering closely with our Data Science and SRE teams to perform data warehouse jobs and periodic Redpanda/streaming database devops tasks

* Managing historical data models in BigQuery/DBT

* Developing pipelines and performing devops tasks to support dashboards

What We Expect

* Experience with one or more of the following: BigQuery, ETL automation/workflow tools (DBT), BI/dashboarding tools (Apache Superset, Metabase), streaming data platforms (Apache Kafka, Redpanda, or Confluent), or other data engineering and data warehouse toolsets/environments

* Some experience with or knowledge of container orchestration tools such as Kubernetes and Kustomize preferred

* Some experience with or knowledge of monitoring and alerting (Grafana dashboards) preferred

* Some experience with or knowledge of SQL: able to create and manage tables within a SQL database

* Proficiency in one or more programming languages, such as Python, R, or Rust

* Ability to serve on-call shifts and support devops needs

* Ability to create documentation and communicate with a variety of audiences

* Clear communication skills (written and verbal) to document processes and architectures

* Ability to work well within a multinational
team environment

* Preference for candidates physically located in the Americas; however, the team is open to candidates in European time zones or other locations

About The Graph

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar DevOps, JavaScript, Node, and Engineer jobs:
$60,000–$110,000/year
#Benefits
* 401(k)

* Distributed team

* Async

* Vision insurance

* Dental insurance

* Medical insurance

* Unlimited vacation

* Paid time off

* 4 day workweek

* 401k matching

* Company retreats

* Coworking budget

* Learning budget

* Free gym membership

* Mental wellness budget

* Home office budget

* Pay in crypto

* Pseudonymous

* Profit sharing

* Equity compensation

* No whiteboard interview

* No monitoring system

* No politics at work

* We hire old (and young)