It's an exciting time to join Metrika! A Series A-funded startup with teammates across the US, Canada, UK, and Europe, we are building the world's premier risk management and compliance platform for digital assets. Metrika works with financial institutions and regulators to help them identify, measure, manage and monitor any risk related to blockchain networks and digital assets.

As a Senior Data Engineer you will be able to contribute to, influence, and take ownership of significant parts of our systems. Our goal is to build a very high-performance platform, capable of analyzing thousands of data points across multiple blockchain networks in real time.

If you are a Senior Data Engineer with a solid understanding of data lakes, data warehouses, ETL, and distributed systems, have passion for your work, and would love to work with a geographically distributed team in an emerging industry, join us! No prior experience in blockchain/digital assets is necessary, but an interest in learning and being deeply immersed is.

What you'll be doing:

* Designing, implementing and maintaining data processing pipelines: this includes ingestion, clean-up, transformation, aggregation, batch and streaming jobs, as well as managing the data lifecycle to ensure affordable and performant long-term storage across our data stores and data lake. You will work closely with our software engineers, SREs and our Analytics team to make sure data flows smoothly across Metrika and beyond to our customers and users.

* Working under a Scrum or Kanban framework.

* Owning your work. This means being proud of your work, actively striving for excellence, observing the best practices of your craft and always aiming to improve your skills.

* Understanding, participating and contributing to the company goals, regardless of your role. Metrika is a small company with a very inclusive culture.
We are looking for people who share those values with us.

Please note: Our Engineering team is predominantly based in Europe. This position is currently open to those resident and currently able to work in the European Economic Area (EU, Norway, Liechtenstein), Switzerland, and the UK.

Metrika Inc. is an Equal Opportunity employer. All applicants will be considered without regard for race, color, national origin, ethnicity, gender, disability, sexual orientation, gender identity, or religion.

We are looking for individuals with:

* A Bachelor's degree in Computer Science, Electrical Engineering, Physics or Mathematics. Master's or higher degrees preferred.

* Multi-year experience in data engineering in large-scale production environments.

* A solid understanding of the concepts of data governance and data lineage/provenance. At Metrika we mostly use Python for data processing: most of our ETL/data processing jobs are written in Python.

* Prior experience with scheduling systems (e.g., Airflow).

* Proven experience with databases (SQL & NoSQL, such as Postgres and MongoDB), distributed query engines (e.g., Trino) and distributed computing frameworks (e.g., Ray).

* An excellent understanding of TDD, agile development methodology and version control.

* The ability to function autonomously to solve problems and deliver working software.
Our remote environment and geographic distribution require people who can work well on their own.

* The ability to communicate well with your team, both interactively and asynchronously, and to be a positive, constructive team member.

You'll be a great fit if you have:

* Worked in and contributed to a Big Data production environment handling multiple GB of data per day.

* Good knowledge of Python.

* Experience with Trino and Airflow.

* Experience with using and building CI/CD pipelines.

* Experience with Docker/Kubernetes or serverless environments.

* Experience with SQS/SNS, Apache Kafka, RabbitMQ or other brokers.

* Experience with public cloud providers, e.g. AWS, GCP, Azure, DigitalOcean etc.

Perks & Benefits

* Competitive salary and equity compensation

* Medical insurance (based on location)

* All-remote

* A generous budget for your home office

* Supported attendance at blockchain conferences

* Budget to meet other Metrikaers in your locality, if applicable

Once you submit your application, you will receive an automated email from the recruitee.com domain within a few minutes acknowledging we have received your application. If you do not receive this email within a few minutes, please check your spam folder or other filtered folders. And to ensure our future communications reach you, please add emails from the recruitee.com domain to your safe list.

#Salary and compensation
No salary data published by the company, so we estimated a salary based on similar jobs related to Python, Serverless, Cloud, NoSQL, Senior and Engineer:
$60,000 — $110,000/year
#Benefits
* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
Remote job
Please reference you found the job on Remote OK; this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
Trunk.io is hiring a Remote Senior Analytics Data Engineer
At Trunk, our mission is to help teams create high-quality software quickly. Merge conflicts, poor code quality or consistency, flaky tests, and dozens of other distractions quickly drain the productivity and morale of those teams. Engineering teams that can stay focused on designing, implementing, and delivering software will build magical, high-quality projects - and they will be happier doing it. We're building the tools that empower teams to land code faster and develop happier.

We are building the foundation for a modern software engineering team. Our founders started this journey in 2021 and have designed, delivered, and scaled software at some of the world's largest and fastest-growing tech companies - Uber, Google, YouTube, and Microsoft. We're building a game-changing company, and we hope you are excited to be a part of that audacious goal.

Software has eaten the world; almost every company produces software in some form or fashion, so our addressable market is virtually every company on earth. We're going after every engineering team on the planet - we're starting with smaller teams, but there are literally hundreds of thousands of companies out there for us to empower, and maybe only a handful (Google, Facebook, Amazon) that are outside our scope.
We are building the DevEx platform to empower the world.

In 2022, we raised a $25M Series A led by Initialized Capital (Garry Tan) and a16z (Peter Levine), with investments from Haystack Ventures, Garage VC, Tom Preston-Werner (Founder/CEO of GitHub), Geoff Schmidt (Founder/CEO Apollo GraphQL), Nicolas Dessaigne (Founder/CEO Algolia), and Oleg Rogynskyy (Founder/CEO People.ai).

What you'll do
* Build data pipelines, text analysis algorithms, query engines, and decision-making engines
* Apply robust and fault-tolerant approaches to create scalable ingestion and data-processing systems
* Debug, profile and optimize distributed data-intensive applications, improving their latency, accuracy, resource consumption, and throughput
* Work with existing applications built with Spark, S3, Timescale, Python and Rust
* Directly implement services and features that leverage the results of your data pipelines
* Implement and improve machine learning and data pipelines

We're looking for
* 5+ years of experience as an engineer with a strong understanding of key concepts in distributed systems
* 3+ years of extensive experience in building and deploying data applications
* Fluency in at least one, and ideally more than one, of these languages: Java/Scala/Kotlin, Python, Go, Rust, or C++
* A good understanding of the following concepts: partitioning, replication, map-reduce, indexing, and CAP
* Experience with distributed storage systems (S3, HDFS, Hive, ClickHouse, Elastic, etc.), distributed processing engines (Spark, etc.), and message queues (Kafka, SQS, etc.)
* Passion for building large-scale ML applications and improving software engineers' productivity
* Some understanding of key concepts in natural language processing, machine learning, or statistical analysis
* Some experience with the machine learning stack (pandas, PyTorch, NumPy, scikit-learn, transformers, etc.)

What we offer
* Unlimited PTO
* Competitive salary and equity
* Work-life balance
* Flexibility to be fully or partly remote
* Few meetings, so you can ship fast and focus on building
* One Medical membership on us!
* Top-notch medical, dental, vision, short-term disability, long-term disability, and life insurance
* All insurance is 100% company-paid ($0 premiums) for employees and highly subsidized for dependents
* FSA, HSA with company contributions, and pre-tax commuter benefits
* 401(k) plan
* Paid parental leave (up to 12 weeks)

The salary and equity ranges for this role are $170K - $210K and 0.15% - 0.35%.

Don't meet every single requirement? At Trunk, we are dedicated to building a diverse and inclusive workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.

If you need assistance or an accommodation due to a disability, we're happy to help. Please contact us at [email protected].

#Salary and compensation
No salary data published by the company, so we estimated a salary based on similar jobs related to Python, Senior and Engineer:
$60,000 — $110,000/year
#Location
San Francisco or Remote
It's an exciting time to join Metrika! A Series A-funded startup in growth mode with teammates across the US, Canada, UK, and Europe, we are building the world's premier operational intelligence platform for blockchain networks. Metrika partners with blockchain protocols, foundations, and node runners to help them and their community members analyze individual and network-wide metrics of their Distributed Ledger Technology (DLT) networks to maintain and improve their performance, security, and reliability.

These are the early days of our platform, and as a Senior Data Engineer you will be able to contribute to, influence, and take ownership of significant parts of our systems. Our goal is to build a very high-performance platform, capable of analyzing thousands of transactions across multiple blockchain networks in real time.

If you are a Senior Data Engineer with a solid understanding of data lakes, data warehouses, ETL, and distributed systems, have passion for your work, and would love to work with a geographically distributed team in an emerging industry, join us! No prior experience in blockchain is necessary, but an interest in learning and being deeply immersed is.

What you'll be doing:

* Designing, implementing and maintaining data processing pipelines: this includes ingestion, clean-up, transformation, aggregation, batch and streaming jobs, as well as managing the data lifecycle to ensure affordable and performant long-term storage across our data stores and data lake. You will work closely with our software engineers, SREs and our Analytics team to make sure data flows smoothly across Metrika and beyond to our customers and users.

* Working under a Scrum or Kanban framework.

* Owning your work.
This means being proud of your work, actively striving for excellence, observing the best practices of your craft and always aiming to improve your skills.

* Understanding, participating and contributing to the company goals, regardless of your role. Metrika is a small company with a very inclusive culture. We are looking for people who share those values with us.

Please note: Our Engineering team is predominantly based in Europe and the eastern United States. This position is currently open to those resident and currently able to work in the European Economic Area (EU, Norway, Liechtenstein), Switzerland, the UK, as well as the eastern United States/Canada (UTC-4/UTC-5 timezones).

Metrika Inc. is an Equal Opportunity employer. All applicants will be considered without regard for race, color, national origin, ethnicity, gender, disability, sexual orientation, gender identity, or religion.

We are looking for individuals with:

* A Bachelor's degree in Computer Science, Electrical Engineering, Physics or Mathematics. Master's or higher degrees preferred.

* Multi-year experience in data engineering in large-scale production environments.

* At Metrika we mostly use Python for data processing; most of our ETL/data processing jobs are written in Python. You will need some familiarity with scheduling systems (e.g. Airflow, Luigi etc.), data transformation tools (e.g. dbt), distributed compute frameworks (e.g. Apache Spark, Apache Flink, ray.io etc.), and a solid understanding of the concepts of data governance and data lineage/provenance.

* An excellent understanding of TDD, agile development methodology and version control.

* The ability to function autonomously to solve problems and deliver working software.
Our remote environment and geographic distribution require people who can work well on their own.

* The ability to communicate well with your team, both interactively and asynchronously, and to be a positive, constructive team member.

You'll be a great fit if you have:

* Worked in and contributed to a Big Data production environment handling multiple GB of data per day.

* Good knowledge of Python.

* Experience with Apache Spark, Apache Flink, Ray.io and Airflow.

* Experience with using and building CI/CD pipelines.

* Experience with Docker/Kubernetes or serverless environments.

* Experience with SQS/SNS, Apache Kafka, RabbitMQ or other brokers.

* Experience with public cloud providers, e.g. AWS, GCP, Azure, DigitalOcean etc.

* Experience with blockchain systems.

Once you submit your application, you will receive an automated email from the recruitee.com domain within a few minutes acknowledging we have received your application. If you do not receive this email within a few minutes, please check your spam folder or other filtered folders. And to ensure our future communications reach you, please add emails from the recruitee.com domain to your safe list.

#Salary and compensation
No salary data published by the company, so we estimated a salary based on similar jobs related to Python, Serverless, Cloud, Node, Senior and Engineer:
$65,000 — $110,000/year
#Location
Remote job
Our crypto wallet is used by millions of people and provides a single, convenient solution for managing all your accounts and tokens across Solana, Ethereum, and Polygon. As a data engineer at Phantom you will see a direct correlation between your work, company growth, and our users' satisfaction. Beyond this, you will work with some of the brightest minds in the web3 space, and you'll have a unique opportunity to solve some of the most interesting data challenges with efficiency and integrity, at a scale few web3 companies can match.

* Snowflake / dbt / Rudderstack / Real-time Data Streaming / Python / SQL

Responsibilities

* Architect: Design, build and launch extremely efficient and reliable data pipelines (ETL) to move data across a number of platforms, including third-party analytics, frontend & backend systems.

* Educate teams: Use your data and analytics experience to see what's missing, and identify and address gaps in existing systems and processes.

* Partnership: Partner with stakeholders to understand business requirements, work with cross-functional data and product teams, and build efficient and scalable data solutions.

* Data: Manage the delivery of high-impact dashboards, tools, and data visualizations.

Qualifications

* 5+ years of experience in SQL or similar languages, and development experience in at least one language (Python, JavaScript, etc.)

* 5+ years of experience in the data warehouse space, custom ETL design, implementation, and maintenance.

* Experience in leading data-driven projects from definition through interpretation and execution

* Experience with data architecture, data modeling, schema design, and software development

* Experience working with cloud analytics platforms and tools, specifically Snowflake, dbt, and Rudderstack

* Bonus: Experience with blockchain or cryptocurrencies, real-time data streaming, and startup environments.

* This role is fully remote;
however, we're only open to candidates based in US and EU time zones.

The target base salary for this role will range between $150,000 and $250,000, with the addition of equity and benefits. This is determined by a few factors, including your skillset, prior relevant experience, quality of interviews, and market factors at the point in time of offer.

#Salary and compensation
No salary data published by the company, so we estimated a salary based on similar jobs related to Web3, Crypto, Python, Cloud, Senior, Engineer and Backend:
$70,000 — $110,000/year
#Location
Worldwide
What this job can offer you

This is an exciting time to join the growing Data Team at Remote, which today consists of over 15 Data Engineers, Analytics Engineers and Data Analysts spread across 10+ countries. Throughout the team we're focused on driving business value through impactful decision making. We're in a transformative period where we're laying the foundations for scalable company growth across our data platform, which truly serves every part of the Remote business. This team would be a great fit for anyone who loves working collaboratively on challenging data problems, and making an impact with their work. We're using a variety of modern data tooling on the AWS platform, such as Snowflake and dbt, with SQL and Python being extensively employed.

This is an exciting time to join Remote and make a personal difference in the global employment space as a Senior Data Engineer, joining our Data team, composed of Data Analysts and Data Engineers. We support decision-making and operational reporting needs by translating data into actionable insights for non-data professionals at Remote. We're mainly using SQL, Python, Meltano, Airflow, Redshift, Metabase and Retool.

What you bring

* 5+ years of experience in data engineering; high-growth tech company experience is a plus

* Strong experience with building data extraction/transformation pipelines (e.g. Meltano, Airbyte) and orchestration platforms (e.g. Airflow)

* Strong experience in working with SQL, data warehouses (e.g. Redshift) and data transformation workflows (e.g. dbt)

* Solid experience using CI/CD (e.g. GitLab, GitHub, Jenkins)

* Experience with data visualization tools (e.g.
Metabase) is considered a plus

* A self-starter mentality and the ability to thrive in an unstructured and fast-paced environment

* Strong collaboration skills and enjoyment of mentoring

* A kind, empathetic, and patient disposition

* Fluent written and spoken English

* Experience working remotely is not required, but is considered a plus

Key Responsibilities

* Playing a key role in Data Platform Development & Maintenance:

  * Managing and maintaining the organization's data platform, ensuring its stability, scalability, and performance.

  * Collaborating with cross-functional teams to understand their data requirements and optimize data storage and access, while protecting data integrity and privacy.

  * Developing and testing architectures that enable data extraction and transformation to serve business needs.

* Further improving our Data Pipeline & Monitoring Systems:

  * Designing, developing, and deploying efficient Extract, Load, Transform (ELT) processes to acquire and integrate data from various sources into the data platform.

  * Identifying, evaluating, and implementing tools and technologies to improve ELT pipeline performance and reliability.

  * Ensuring data quality and consistency by implementing data validation and cleansing techniques.

  * Implementing monitoring solutions to track the health and performance of data pipelines and identify and resolve issues proactively.

  * Conducting regular performance tuning and optimization of data pipelines to meet SLAs and scalability requirements.

* Digging deep into dbt modelling:

  * Designing, developing, and maintaining dbt (Data Build Tool) models for data transformation and analysis.

  * Collaborating with Data Analysts to understand their reporting and analysis needs and translate them into dbt models, making sure they respect internal conventions and best practices.

* Driving our Culture of Documentation:
Creating and maintaining technical documentation, including data dictionaries, process flows, and architectural diagrams.

* Collaborating with cross-functional teams, including Data Analysts, SREs (Site Reliability Engineers) and Software Engineers, to understand their data requirements and deliver effective data solutions.

* Sharing knowledge and offering mentorship, providing guidance and advice to peers and colleagues, creating an environment that empowers collective growth.

Practicals

* You'll report to: Engineering Manager - Data

* Team: Data

* Location: For this position we welcome everyone to apply, but we will prioritise applications from the following locations as we encourage our teams to diversify: Vietnam, Indonesia, Taiwan and South Korea

* Start date: As soon as possible

Remote Compensation Philosophy

Remote's Total Rewards philosophy is to ensure fair, unbiased compensation and fair equity pay along with competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labor practices, and therefore we make sure to pay above in-location rates. We hope to inspire other companies to support global talent-hiring and bring local wealth to developing countries.

At first glance our salary bands seem quite wide - here is some context. At Remote we have international operations and a globally distributed workforce. We use geo ranges to account for geographic pay differentials as part of our global compensation strategy, and to remain competitive in various markets while hiring globally.

The base salary range for this full-time position is $49,650 USD to $111,700 USD. Our salary ranges are determined by role, level and location, and our job titles may span more than one career level.
The actual base pay for the successful candidate in this role depends on many factors, such as location, transferable or job-related skills, work experience, relevant training, business needs, and market demands. The base salary range may be subject to change.

Application process

* Interview with recruiter

* Interview with future manager

* Async exercise stage

* Interview with team members

#Salary and compensation
No salary data published by the company, so we estimated a salary based on similar jobs related to Python, Testing, Senior and Engineer:
$60,000 — $110,000/year
#Location
Rio de Janeiro, Rio de Janeiro, Brazil
Nansen is a blockchain analytics platform that enriches on-chain data with millions of wallet labels. Crypto investors use Nansen to discover opportunities, perform due diligence, and defend their portfolios with our real-time dashboards and alerts.

The successful candidate will preferably be located in the EMEA or APAC timezones.

The Opportunity:

As a Senior Data Engineer at Nansen you will play a key role in architecting and scaling our data infrastructure to meet the growing demands of crypto investors. Your work will directly impact our product's performance, reliability, and capability, solidifying Nansen's position as a leader in blockchain analytics.

You will work with cutting-edge technologies and tackle some of the most challenging and exciting problems in data at the intersection of finance and technology.

Key Responsibilities:

* Lead the development and optimization of our data pipelines, databases and systems for serving data to our customers, ensuring scalability, efficiency, and reliability.

* Work in close collaboration with product engineers and analysts to design and implement robust data models.

* Drive innovation by staying up to date with the latest data engineering practices, tools, and technologies, applying them to solve complex business and data challenges.

* Mentor and guide junior data engineers, fostering a culture of excellence, continuous learning and knowledge sharing within the team.

* Work closely with cross-functional teams to align data engineering initiatives with business objectives and customer needs.

Requirements:

* A proven track record of building and scaling high-performance data systems

* Excellent SQL and Python
skills, familiarity with dbt and JS\n\n* Experience with modern cloud-based database technologies for both batch and streaming workloads (e.g. ClickHouse, BigQuery)\n\n* An ability to work full-stack with data; software engineering and data visualisation skills a big plus\n\n* Excellent communication skills, with the ability to collaborate effectively in a remote work environment across multiple time zones.\n\n* Strong interest in database internals and streaming pipelines, as well as the logical organisation of information in data warehouses.\n\n* A passion for blockchain, crypto, and Web3 technologies.\n\n\n\n\nWhat We Offer:\n\n\n* Competitive salary and generous equity.\n\n* Remote work environment with a flexible schedule.\n\n* A team that values data and AI; our founders are data engineers and data scientists\n\n* The opportunity to work from our Tech Hubs (flight & accommodation paid for) in Singapore, London, Bangkok, & Lisbon.\n\n* A company culture that values transparency, speed, courage, and curiosity.\n\n* Opportunities for personal and professional growth as the company scales.\n\n* Exposure to a global network of industry experts, partners, and influencers.\n\n\n\n\n \n \n\n#Salary and compensation\n
No salary data published by the company, so we estimated the salary based on similar jobs related to Design, Web3, Crypto, Python, Finance, Senior, Junior and Engineer roles:
$70,000 — $115,000/year
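The Nansen requirements above center on SQL skills for enriching on-chain data with wallet labels. As a minimal, illustrative sketch of that kind of enrichment join (not Nansen's actual schema — the table and column names are hypothetical, and the standard library's sqlite3 stands in for a warehouse like ClickHouse or BigQuery):

```python
import sqlite3

# In-memory database as a stand-in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transfers (tx_hash TEXT, wallet TEXT, amount REAL);
    CREATE TABLE wallet_labels (wallet TEXT, label TEXT);
    INSERT INTO transfers VALUES
        ('0xa1', '0xabc', 120.0),
        ('0xa2', '0xdef', 45.5),
        ('0xa3', '0xabc', 300.0);
    INSERT INTO wallet_labels VALUES
        ('0xabc', 'Smart Money');
""")

# Enrich raw transfers with labels; unlabeled wallets fall back to 'unknown'.
rows = conn.execute("""
    SELECT t.wallet,
           COALESCE(l.label, 'unknown') AS label,
           SUM(t.amount)                AS total_volume
    FROM transfers t
    LEFT JOIN wallet_labels l ON l.wallet = t.wallet
    GROUP BY t.wallet, label
    ORDER BY total_volume DESC
""").fetchall()

for wallet, label, volume in rows:
    print(wallet, label, volume)
```

The LEFT JOIN plus COALESCE pattern is what keeps unlabeled wallets visible in the output instead of silently dropping them.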
#Benefits
💰 401(k)
🌎 Distributed team
⏰ Async
🤓 Vision insurance
🦷 Dental insurance
🚑 Medical insurance
🏖 Unlimited vacation
🏖 Paid time off
📆 4 day workweek
💰 401k matching
🏔 Company retreats
🏬 Coworking budget
📚 Learning budget
💪 Free gym membership
🧘 Mental wellness budget
🖥 Home office budget
🥧 Pay in crypto
🥸 Pseudonymous
💰 Profit sharing
💰 Equity compensation
⬜️ No whiteboard interview
👀 No monitoring system
🚫 No politics at work
🎅 We hire old (and young)
#Location
EMEA/APAC, Remote
🙏 Please reference you found the job on Remote OK, this helps us get more companies to post here, thanks!
Daniel J Edelman Holdings is hiring a Remote Senior Data Engineer
We are currently seeking a Senior Data Engineer with 5-7 years' experience. The ideal candidate will be able to work independently within an agile working environment and have experience working with cloud infrastructure, leveraging tools such as Apache Airflow, Databricks, dbt and Snowflake. Familiarity with real-time data processing and AI implementation is advantageous.

Responsibilities:
* Design, build, and maintain scalable and robust data pipelines to support analytics and machine learning models, ensuring high data quality and reliability for both batch and real-time use cases.
* Design, maintain, and optimize data models and data structures in tooling such as Snowflake and Databricks.
* Leverage Databricks for big data processing, ensuring efficient management of Spark jobs and seamless integration with other data services.
* Utilize PySpark and/or Ray to build and scale distributed computing tasks, enhancing the performance of machine learning model training and inference processes.
* Monitor, troubleshoot, and resolve issues within data pipelines and infrastructure, implementing best practices for data engineering and continuous improvement.
* Diagrammatically document data engineering workflows.
* Collaborate with other Data Engineers, Product Owners, Software Developers and Machine Learning Engineers to implement new product features by understanding their needs and delivering in a timely manner.

Qualifications:
* Minimum of 3 years' experience deploying enterprise-level scalable data engineering solutions.
* Strong examples of independently developed end-to-end data pipelines, from problem formulation and raw data to implementation, optimization, and results.
* Proven track record of building and managing scalable cloud-based infrastructure on AWS (incl. S3, DynamoDB, EMR).
* Proven track record of implementing and managing the AI model lifecycle in a production environment.
* Experience using Apache Airflow (or equivalent), Snowflake, and Lucene-based search engines.
* Experience with Databricks (Delta format, Unity Catalog).
* Advanced SQL and Python knowledge with associated coding experience.
* Strong experience with DevOps practices for continuous integration and continuous delivery (CI/CD).
* Experience wrangling structured and unstructured file formats (Parquet, CSV, JSON).
* Understanding and implementation of best practices within ETL and ELT processes.
* Data quality best-practice implementation using Great Expectations.
* Real-time data processing experience using Apache Kafka (or equivalent) is advantageous.
* Works independently with minimal supervision.
* Takes initiative and is action-focused.
* Mentors and shares knowledge with junior team members.
* Collaborative, with a strong ability to work in cross-functional teams.
* Excellent communication skills, with the ability to communicate with stakeholders across varying interest groups.
* Fluency in spoken and written English.

#LI-RT9

Edelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics and data consultancy with a distinctly human mission.

We use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting and connections more meaningful.

DXI brings together and integrates the necessary people-based PR, communications, social, research and exogenous data, as well as the technology infrastructure to create, collect, store and manage first-party data and identity resolution.
DXI comprises over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.

To learn more, visit: https://www.edelmandxi.com

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to Python, DevOps, Cloud, Senior, Junior and Engineer roles:
$60,000 — $110,000/year
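The Edelman role above leans on orchestration tools like Apache Airflow. The core idea those tools implement is dependency-ordered execution of pipeline tasks; a minimal sketch of that idea in plain Python using the standard library's graphlib (the task names are invented for illustration, and this is not Airflow's API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "extract_raw": set(),
    "clean": {"extract_raw"},
    "load_snowflake": {"clean"},
    "dbt_models": {"load_snowflake"},
    "publish_dashboards": {"dbt_models"},
}

# graphlib resolves a valid execution order; Airflow schedules the same
# kind of DAG, but adds distribution, retries, and time-based scheduling.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Because the example graph is a simple chain, the resolved order is exactly extract → clean → load → model → publish; with branching dependencies, any order consistent with the edges would be valid.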
Torc Robotics is hiring a Remote Senior Business Intelligence Analyst
Meet The Team:

The Senior Business Intelligence Analyst must be able to translate business value goals into user-centered dashboards, visualizations, and underlying data models that support decision-making, while using best practices for user interaction and interface design. They must be comfortable starting with ill-defined datasets and problems, and helping stakeholders unpack what will be most helpful and achievable given the data available. They use statistical methods to test hypotheses and provide predictive modeling to uncover insights and assess data reliability for strategic planning and reporting. They work with data sets of various kinds and sizes in databases, spreadsheets, and data lakes, and must have strong SQL skills and be able to work in one or multiple BI frameworks depending on need (Power BI, Tableau, Looker, etc.). They must be comfortable working in an agile environment, have experience working with UIs that make data accessible and consumable, enjoy the challenge of highly complex technical contexts, and above all be passionate about data and analytics.

What You'll Do:

* Develop and implement novel solutions to monitor and improve business performance and efficiency, from gathering requirements from stakeholders to partnering with data-producing teams and other analysts to build and operationalize functional systems.
* Define, instrument, and maintain new and existing business KPIs from scratch, leveraging SQL queries and Python scripting.
* Build automation to ensure we can understand and act on business performance with accurate, scalable, and low-latency solutions that require minimal maintenance.
* Create actionable, user-friendly, and intuitive dashboards using Tableau or other tools to surface business performance and support scalable data-driven decision making.
* Ensure transparency of data quality by clearly documenting data sources and data validation processes and regularly auditing the results.
* Work with IT staff to design and implement secure and sustainable paths for data access from systems to support timely dashboards.
* Use out-of-the-box thinking to develop and implement solutions to unique challenges.
* Create documentation and provide training to inform stakeholders and cultivate data literacy within the organization.
* Answer complex business questions via ad-hoc SQL queries or Python notebooks to enable business owners to make critical decisions.

A successful candidate will be:

* Data-driven and highly analytical: you're able to translate data and insights into a meaningful story to drive strategy, action, and decision-making at an executive level.
* A self-starter: you boldly pursue your work and love the responsibility of being personally empowered, constantly prioritizing your work across several projects.
* Curious: you're innovative, creative, and constantly looking for opportunities to tweak and optimize.
* A team player: you love going heads-down in SQL or Pandas, but also partnering with your teammates and stakeholders.

What You'll Need to Succeed:

* Degree in Data Analysis, Statistics, Information Technology, Computer Science or a related field.
* BS + 10+ years of experience, OR MS + 7+ years of experience, OR PhD + 5+ years of experience.
* 5+ years of experience working with databases, business analytics, and large-scale data systems.
* 5+ years of experience with database design, data modeling, or object modeling.
* 5+ years of experience with Business Intelligence/data analytics tools (Microsoft Power BI, Dashboards, SQL, Tableau, etc.).
* Analytical thinking with a data-driven approach to problem solving, and the ability to extract value from data and connect insights to business outcomes.
* Ability to work under pressure and multi-task in a fast-paced, rapidly changing environment; comfortable with ambiguity.
* Outstanding attention to detail, strong analytical and communication skills, comfort working independently, strong listening skills, an innovative mindset, and the capacity to work under pressure to meet tight deadlines.
* A team player with a professional "get it done" attitude.
* Proven history of living the values important to Torc: Integrity, Accountability, Respect, Innovation, Success.

Bonus Points:

* Ability to communicate complicated concepts and recommended courses of action to engage internal stakeholders.
* Technology background with a proven ability to quickly understand complex technology subjects.
* Highly customer-focused, business-oriented, and objective-driven.
* Ability to successfully manage multiple priorities and projects.

Perks of Being a Full-time Torc'r

Torc cares about our team members, and we strive to provide benefits and resources to support their health, work/life balance, and future. Our culture is collaborative, energetic, and team-focused. Torc offers:

* A competitive compensation package that includes a bonus component and stock options
* 100% paid medical, dental, and vision premiums for full-time employees
* 401K plan with a 6% employer match
* Flexibility in schedule and generous paid vacation (available immediately after start date)
* Company-wide holiday office closures
* AD+D and Life Insurance

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to Design, Python and Senior roles:
$60,000 — $110,000/year
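The Torc posting above highlights hypothesis testing as a core skill. A hand-rolled two-proportion z-test, the workhorse behind many "is this dashboard movement real?" questions, can be sketched with just the standard library (the visitor and conversion counts are invented for illustration):

```python
from math import sqrt
from statistics import NormalDist

# Invented example: conversions out of visitors for control vs. variant.
x_a, n_a = 200, 4000   # control: 5.0% conversion
x_b, n_b = 260, 4000   # variant: 6.5% conversion

p_a, p_b = x_a / n_a, x_b / n_b
p_pool = (x_a + x_b) / (n_a + n_b)

# Standard error under the null hypothesis of equal rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

With these numbers the p-value comes out well under 0.05, so the observed lift would be unlikely under the null; in practice the test choice and sample size would be fixed before looking at the data.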
Match Group is hiring a Remote Senior Data Analyst
The growing Match Group Central Platforms team is looking for an analyst to work cross-functionally with the Match Group organization and our portfolio of brands to help identify, understand, and improve business, operations, and product challenges. Reporting to the Director of Product, the Senior Data Analyst will deliver high-quality strategic and tactical analysis, collaborating with the product, engineering, data science, business intelligence, member experience and operations teams.

Know where you belong. Match Group is a leading provider of dating products across the globe. Our portfolio includes Tinder, Match, Hinge, Plenty of Fish, The League, and others, each designed to spark meaningful connections for singles worldwide. Creating a sense of belonging doesn't stop at our products - it's the foundation of every team we hire.

We are flexible and offer full remote or hybrid working models as options to accommodate our team. Match Group is headquartered in Dallas, TX, with offices in LA, Palo Alto, and New York.

In this role you will:
* Own defining, tracking, and monitoring business KPIs for the Central Platforms Product team.
* Partner with product managers to create hypotheses, define quantitative goals, and design experiments for roadmap projects.
* Partner with engineering to improve data collection through data infrastructure design, data pipelining, and warehousing solutions.
* Collaborate with cross-functional teams, including business stakeholders, developers, and data engineers, to identify data requirements, gather feedback, and design and implement solutions.
* Conduct exploratory data analysis to discover patterns and insights that inform future strategy.
* Gather, compile, and extract meaningful insights from complex datasets, using statistical techniques and machine learning algorithms when applicable.
* Summarize and evangelize your findings across the company, providing Executives, Product Managers, Engineering teams, and other Data Analysts with deep-dive quantitative analyses in an immediately usable format, extracting key insights from extensive, complex data sets.
* Proactively find opportunities to take the analytics framework to the next level, and lead stakeholders to adopt new frameworks and methodologies.
* Take ownership of ambiguous and complex analytical assignments.

We could be a Match if you have:
* 5+ years of progressive experience using quantitative analysis to make business-focused recommendations.
* 3+ years of analytics experience working directly with product managers or embedded within a product team.
* Experience with data warehousing and data pipelining solutions (a plus).
* Familiarity with machine learning concepts and algorithms.
* Proven technical data analytics skills in working with large amounts of data at different levels of aggregation, from raw telemetry to metric stores.
* Working knowledge of trust and safety analytics metrics (fraud signals, bad actor patterns, etc.).
* Strong understanding of incrementality, as well as sizing business problems and opportunities.
* Understanding of basic statistical concepts such as confidence intervals, the normal distribution, correlation, linear regression, and multivariate testing.
* Strong communication skills at different levels of the organizational hierarchy.
* A proactive skill set that encompasses problem identification, analysis, solution definition, results, and communication.
* Proficiency with SQL, R and Python (or equivalent).
* In-depth experience with data visualization and analytical tools (e.g., Tableau, Power BI, Looker).

$118,000 - $141,500 a year. Factors such as the scope and responsibilities of the position, the candidate's work experience, education/training, job-related skills, internal peer equity, as well as market and business considerations may influence the base pay offered. This salary range is reflective of a position based in Los Angeles, CA, and will be subject to a geographic adjustment (according to a specific city and state) if an authorization is granted to work outside of the location listed in this posting.

#LI-CENTRAL

#LI-REMOTE

#LI-CH1

Why Match Group?

Our mission is simple: to help people find love and happiness! We love our employees too, and understand the importance of all life's milestones. Here are some of the benefits we are proud to offer:

Mind & Body: medical, mental health, and wellness benefits to support your overall health and well-being
Financial Wellness: competitive compensation, a 100% employer match on 401k contributions up to 10% (capped at $10,000), as well as an employee stock purchase program to help you feel supported in your financial security
Unplug: generous PTO and 18 paid holidays so you can unplug
Career: an annual training allowance for professional development, and ERG membership opportunities and events so you feel connected and empowered in your work
Family: families come in all shapes and sizes, so we offer 20 weeks of 100% paid parental leave, fertility, adoption, and child care resources, as well as pet insurance and discounts
Company Gatherings: we host fun happy hours and company events where our employees get to know each other and build a sense of connection and belonging!

We are proud to be an equal opportunity employer and we value the rich dynamics that diversity brings to our company.
We do not discriminate on the basis of race, religion, color, creed, national origin, ancestry, disability, marital status, age, sexual orientation, sex (including pregnancy and sexual harassment), gender identity or expression, uniformed service or veteran status, genetic information, or any other legally protected characteristic. Period.

If you require reasonable accommodation to complete a job application, pre-employment testing, or a job interview, or to otherwise participate in the hiring process, please contact [email protected].

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to Design, Embedded, Python and Senior roles:
$60,000 — $110,000/year
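The Match Group requirements above list confidence intervals and the normal distribution among the expected basics. A small illustration with invented numbers, computing a 95% Wald interval for a conversion rate using only the standard library:

```python
from math import sqrt
from statistics import NormalDist

# Invented example: 540 conversions out of 9,000 sessions.
conversions, sessions = 540, 9000
p_hat = conversions / sessions            # 6.0% observed rate

# 95% two-sided critical value from the standard normal distribution.
z = NormalDist().inv_cdf(0.975)           # approximately 1.96
margin = z * sqrt(p_hat * (1 - p_hat) / sessions)

low, high = p_hat - margin, p_hat + margin
print(f"{p_hat:.1%} (95% CI: {low:.1%} to {high:.1%})")
```

The Wald interval is the simplest choice and is fine at this sample size; for small samples or rates near 0% or 100%, a Wilson interval is the safer pick.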
HopSkipDrive is hiring a Remote Senior Data Analyst, In-Ride Experience
Who you are

Reporting to the Director of In-Ride Experience, the ideal candidate has a strong foundation in data analytics and experimentation, a go-getter attitude, and the drive to advance their career in a rapidly growing technology company. You think like a business owner. You are excited to jump in and help your teammates, and you're always thinking of ways to constructively challenge the status quo and improve the business. You are an A+ player who is inspired by the mission we are pursuing and by the opportunity to define a new category in an incredibly fast-moving market. You are excited to bring your best every day, to learn from those around you, and to push hard while contributing to our powerful vision of positively impacting kids' lives every day.

At HopSkipDrive, we know that to tackle our toughest challenges, we need different approaches, unique perspectives, and new ways of thinking. We are building a team of creative problem-solvers from many different backgrounds.

Your Role:

As an Analyst you will empower people in the company to take informed actions. Deliverables will be what we call data products, but also thought leadership and troubleshooting. You will help stakeholders enhance their data product portfolio.
Specific day-to-day responsibilities:

* Lead data analyses for In-Ride Experience and Safety projects from scoping to delivery.
* Analyze trends and identify opportunities for In-Ride Experience and Safety improvements.
* Surface actionable insights and recommendations using data visualizations and presentations to a range of stakeholders.
* Develop tools, from actionable Tableau dashboards to predictive models, with attention to usability.
* Gather, connect and analyze data from a variety of sources to improve the effectiveness of initiatives aiming to improve the in-ride experience and safety.
* Collaborate with product and operations stakeholders to design A/B and multivariate tests, and report on performance, attribution, and effectiveness.
* Proactively investigate data discrepancies and implement measures to uphold data quality and accuracy across all analyses.
* Partner with data producers to drive initiatives for continuous data improvement and refinement.
* Evaluate existing processes and dashboards pertaining to In-Ride Experience and Safety, and offer recommendations for enhancements in tools, metric inputs, and projected outcomes.

Skills and Experience:

* 5+ years of hands-on data analytics experience with direct stakeholder interactions, across data science or business intelligence engineering teams within the tech space
* Applied statistics and mathematics skills: hypothesis testing, regression, classification, optimization, experimental design, or A/B testing
* Experience working with product experimentation analyses
* Strong experience developing solutions using Tableau, SQL (including performance tuning), and Excel/Sheets
* Experience with data quality and developing auditable data products
* Experience building solutions with data collection, integration, and warehousing
* Excellent verbal and written communication skills to tell data stories
* Ability to thrive in a fast-paced work environment
* A highly self-motivated learner, with attention to detail

Nice-to-have experience:

* Statistical competence
* Marketplace company experience
* Familiarity with GitHub, ClickUp, DataGrip
* Familiarity with data transformation, lineage, and low-code tools
* Experience with Python and Unix shell
* Data governance

What you will get

We want you to be an owner in our company and share in executing our vision, so every full-time employee has equity. In addition, we offer flexible vacation; medical, dental, vision and life insurance; 401(k); FSA; and an opportunity to work for a uniquely positioned, VC-backed company in a hugely attractive space with significant upside potential. HopSkipDrive is committed to fair and equitable compensation practices. The base salary range for this role is $100,000 - $115,000. This compensation will ultimately be in line with the location in which the position is filled. Final compensation for this role will be determined by several factors, such as a candidate's relevant work experience, skill set, certifications, and specific work location. The total compensation package for this role also includes equity stock options.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to Design, Python and Senior roles:
$60,000 — $110,000/year
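The HopSkipDrive posting above stresses data quality and auditable data products. A minimal sketch of the kind of automated audit that catches discrepancies before they reach a dashboard (the record fields and rules here are invented for illustration; dedicated tools like Great Expectations generalize the same pattern):

```python
# Hypothetical ride records to audit before they feed a dashboard.
rides = [
    {"ride_id": "r1", "duration_min": 22, "rating": 5},
    {"ride_id": "r2", "duration_min": -3, "rating": 4},    # bad duration
    {"ride_id": "r3", "duration_min": 35, "rating": None}, # missing rating
]

# Each named check returns True for a passing row; collecting the ids of
# failing rows makes the audit result reportable, not just pass/fail.
checks = {
    "duration_positive": lambda r: r["duration_min"] > 0,
    "rating_present":    lambda r: r["rating"] is not None,
}

failures = {
    name: [r["ride_id"] for r in rides if not check(r)]
    for name, check in checks.items()
}
print(failures)
```

Keeping the failing row ids (rather than a boolean) is what makes the data product auditable: the report says exactly which records broke which rule.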
#Location
Denver, Colorado, United States
Position overview

We are seeking a Senior Data Analyst with experience in the financial domain to join us. This role's focus is on automating reporting and data processes, enabling advanced analytics for better financial decision-making, and developing data products for the Finance Team. Reporting to the Head of Data and working closely with the Finance Team, the ideal candidate requires both business acumen and technical expertise, and will play an important contributing role in our financial operations and strategy.

Responsibilities

* Automate financial data processes to enhance efficiency, accuracy and reliability.
* Implement predictive analytics based on statistical models to provide real-time, accurate insights on customer activity and financial data, leveraging BigQuery and Python.
* Make data available to stakeholders, assist them in interpreting and using it for their daily tasks, and provide actionable recommendations based on analyses.
* Synthesise findings and communicate them in a practical and actionable manner through visualizations and storytelling.
* Develop self-service data tools that allow the Finance Team to generate custom reports and data extracts independently, reducing the dependency on ad-hoc requests.
* Attend to the reporting requirements of the organisation: from ad-hoc requests to recurring financial and regulatory reports.
* Ensure the quality, reliability, and accuracy of all reports and analyses.
* Continuously seek ways to improve data-related processes, including report automation, data quality assurance, and advanced analytics techniques.

Skills needed

* Advanced SQL and Python skills, with proven experience in data analytics, particularly in a finance-focused role.
* Strong data manipulation, structuring and wrangling skills, coupled with practical experience in time series analysis and predictive analytics.
* Proficiency in data visualization and analytics tools like BigQuery and Looker.
* Strong business acumen, with skills such as customer-centricity, stakeholder management, and collaboration.
* Exceptional analytical, critical thinking, and problem-solving abilities.
* A methodical and logical approach combined with accuracy and attention to detail.
* Strong communication skills, both written and verbal.
* Experience with GCP (Google Cloud Platform) products and services such as BigQuery, Cloud Functions, and Vertex AI is a plus.
* Familiarity with fintech or the crypto market is a plus.

Other requirements

* A dedicated workspace.
* A reliable internet connection with the fastest speed possible in your area.
* Devices and other essential equipment that meet minimal technical specifications.
* Alignment with Our Values and the Xapo Values-Driven Leadership principles.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar jobs related to Crypto, Python, Finance, Cloud and Senior roles:\n\n
$60,000 — $110,000/year\n
\n\n#Benefits\n
401(k)\n\nDistributed team\n\nAsync\n\nVision insurance\n\nDental insurance\n\nMedical insurance\n\nUnlimited vacation\n\nPaid time off\n\n4 day workweek\n\n401k matching\n\nCompany retreats\n\nCoworking budget\n\nLearning budget\n\nFree gym membership\n\nMental wellness budget\n\nHome office budget\n\nPay in crypto\n\nPseudonymous\n\nProfit sharing\n\nEquity compensation\n\nNo whiteboard interview\n\nNo monitoring system\n\nNo politics at work\n\nWe hire old (and young)\n\n
\n\n#Location\nGibraltar, Gibraltar
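Purely as an illustration (not part of the posting): the simplest form of the time-series forecasting work the analyst role above describes could start from a naive moving-average baseline, sketched here in plain Python. The function and the revenue figures are hypothetical; in practice this would live in BigQuery SQL or a Python notebook.

```python
# Hypothetical sketch: a naive moving-average baseline for monthly revenue,
# the kind of starting point a predictive-analytics task might use.
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

# Illustrative monthly revenue figures (made up for the example)
revenue = [100.0, 110.0, 120.0, 130.0, 140.0, 150.0]
print(moving_average_forecast(revenue))  # mean of the last three months
```

Real work would of course replace this baseline with proper statistical models, but it shows the shape of the task.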
Please reference that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
\nReports to: Director of Product and Insights\nPay rate: $60/hour on a 10-15 hour project basis\nDeadline for applications: April 14, 2024\n\nThe News Revenue Hub is seeking contract data analysts to aid in ongoing Google Analytics and Parse.ly data analysis. \n\nThese data analysts will help newsrooms work with their audience data, identifying and highlighting metrics that help them move the needle on audience growth, loyalty, donations, and other key performance metrics. Data analysts will be experts at configuring and gathering audience data from sources like Google Analytics and Parse.ly, and translating it into benchmarks, strategies, and actions. The opportunity to work with dozens of newsrooms is one of many unique benefits of working with the Hub. \n\nThe News Revenue Hub is a mission-driven 501(c)(3) nonprofit organization dedicated to helping news organizations achieve greater financial sustainability. We help our members implement fundraising models and audience strategies to cultivate sustained, loyal readership.\n\nAs a data analyst, your chief responsibilities will include:\n\n\n* Pull and clean site traffic data by audience segment, referrer source, and site sections to identify trends and patterns from Parse.ly and Google Analytics\n\n* Create traffic and loyalty analyses on audience behavior and engagement trends to help inform newsrooms' editorial and audience strategies\n\n* Craft recommendations for newsroom leadership on where to further invest time and resources to boost organizational goals\n\n* Analyze topic and category performance by audience segment and reader behavior to identify common behavioral trends\n\n* Transform data into deliverables such as reports and slide decks for nontechnical audiences\n\n\n\n\nThe ideal candidate will have:\n\n\n* Proven experience with Parse.ly, Google Analytics (or similar) and data analysis using Google Sheets or Excel\n\n* Experience and familiarity pulling and cleaning website and content analytics 
data for granular analysis \n\n* An understanding of the business logic of how newsrooms sort and categorize traffic data by article and site sections \n\n* A high degree of integrity and discretion; must be willing to sign a nondisclosure agreement\n\n* Skill in analyzing and visualizing data, drawing actionable insights, and communicating strategic tactics informed by data to non-technical audiences\n\n* Experience working in a newsroom and communicating with stakeholders like editors and senior newsroom leadership \n\n* A passion for being analytical and detail-oriented: not just reporting data, but also surfacing themes around key trends and results\n\n\n\n\nAdditional skills desired, though not required:\n\n\n* Preferred data management and programming skills include Python and SQL\n\n* Experience with data migrations and QA is helpful but not required\n\n\n \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar jobs related to Python and Senior roles:\n\n
$60,000 — $110,000/year\n
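As a hedged sketch of the "pull and clean site traffic data" responsibility in the posting above: normalizing referrer labels and aggregating pageviews from exported analytics rows. The field names (`referrer`, `pageviews`) are hypothetical placeholders, not the actual Parse.ly or Google Analytics export schema.

```python
# Illustrative only: aggregate pageviews by normalized referrer source
# from rows exported out of an analytics tool (field names are made up).
from collections import defaultdict

def pageviews_by_referrer(rows):
    """Sum pageviews per referrer, normalizing case/whitespace and
    bucketing empty referrers as 'direct' traffic."""
    totals = defaultdict(int)
    for row in rows:
        referrer = (row.get("referrer") or "direct").strip().lower()
        totals[referrer] += int(row.get("pageviews", 0))
    return dict(totals)

rows = [
    {"referrer": "Google", "pageviews": "120"},
    {"referrer": "google ", "pageviews": "30"},
    {"referrer": "", "pageviews": "50"},
]
print(pageviews_by_referrer(rows))  # 'Google' variants merge; '' becomes 'direct'
```

The same cleanup (trim, lowercase, bucket empties) is what makes segment-level benchmarks comparable across newsrooms.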
dbt Labs is hiring a Remote Senior Software Engineer II Multi Cell
\nOver the past year we have built a next-generation multi-cell architecture, and we are looking for a Senior Software Engineer II to join the team and help add capabilities and migrate customers to the new deployments.\n\nThe cell-based architecture is a large-scale, worldwide distributed system, and this team has an outsize impact on every customer of dbt Labs. Today we serve some of the largest data-driven organizations in the world, enabling them to make decisions based on the knowledge at the core of their business. The quality, reliability, and performance of our multi-cell implementation translate directly into leverage for analysts, analytics engineers, and data engineers in organizations of all shapes and sizes.\nIn this role, you can expect to:\n\n\n* Build cell-based application architecture that reliably and performantly delivers dbt Cloud to customers worldwide. You will work on a variety of technologies and features including our regional service layer, enabling self-service accounts across regions, cell migrations and product security.\n\n* Collaborate with multiple engineering teams, Product Management, Security, and Customer Support.\n\n* Work with a variety of programming languages, systems, and technologies, including: Golang, Python, Postgres, Kubernetes, Terraform, Auth0, and Datadog.\n\n* Drive scaling and automation initiatives.\n\n* Define tradeoffs and make decisions about what, how, and when we build. We are a fast-moving startup, and building the right platform at the place where application and infrastructure meet unlocks reliability, quality, and productivity for the long term.\n\n\n\nQualifications:\n\n\n* 7+ years of experience in software engineering, including production experience supporting SaaS applications.\n\n* A Bachelor's degree in a related field (computer science, computer engineering, etc.) 
OR\n\n* Completion of an engineering-related bootcamp.\n\n\n\nYou are a good fit if you:\n\n\n* Have implemented large-scale distributed systems and have a deep interest in application performance, scalability, reliability, and operability.\n\n* Have designed and built cloud applications that include containerized workloads, Python or Golang, and at least some of our technology stack. You don't need to be experienced with every technology we use today.\n\n* Have a systematic problem-solving approach coupled with strong communication skills and a sense of ownership and drive.\n\n* Ensure high programming standards in your team by writing unit, functional, and integration tests and participating in timely, constructive code review.\n\n* Are comfortable operating in a fast-paced environment that emphasizes making small changes to rapidly iterate, learn, and deliver.\n\n* Are interested in our mission and values, and inspired to drive progress in the data and analytics ecosystem.\n\n\n\nYou'll have an edge if you:\n\n\n* Have excellent written communication skills. We are a remote-first company that uses writing to facilitate decision-making.\n\n* Have experience with technical leadership.\n\n\n\nCompensation and Benefits:\n\n\n* Salary: $180,000-$235,000 USD\n\n* Equity Stake*\n\n* Benefits - dbt Labs offers:\n\n\n* Unlimited vacation (and yes we use it!)\n\n* 401k w/3% guaranteed contribution\n\n* Excellent healthcare\n\n* Paid Parental Leave\n\n* Wellness stipend\n\n* Home office stipend, and more!\n\n\n\n\nWhat to expect in the hiring process (all video interviews unless accommodations are needed):\n\n\n* Interview with a Talent Acquisition Partner \n\n* Technical Interview with Hiring Manager\n\n* Team Interviews \n\n* Final interview with leadership team member\n\n\n\n\n#LI-RC1\n \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar jobs related to SaaS, Python, Cloud, Senior and Engineer roles:\n\n
$60,000 — $110,000/year\n
\nWe're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims. As a Senior Data Engineer, you will work with the analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving over a million Hims & Hers users.\nYou Will:\n\n\n* Architect and develop data pipelines to optimize performance, quality, and scalability\n\n* Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources\n\n* Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources into the data lake\n\n* Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance\n\n* Orchestrate sophisticated data flow patterns across a variety of disparate tooling\n\n* Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics\n\n* Partner with the rest of the Data Platform team to set best practices and ensure their execution\n\n* Partner with the analytics engineers to ensure the performance and reliability of our data sources\n\n* Partner with machine learning engineers to deploy predictive models\n\n* Partner with the legal and security teams to build frameworks and implement data compliance and security policies\n\n* Partner with DevOps to build IaC and CI/CD pipelines\n\n* Support code versioning and code deployments for data pipelines\n\n\n\nYou Have:\n\n\n* 8+ years of professional experience designing, creating and maintaining scalable data pipelines using Python, API calls, SQL, and scripting 
languages\n\n* Demonstrated experience writing clean, efficient & well-documented Python code, and a willingness to become effective in other languages as needed\n\n* Demonstrated experience writing complex, highly optimized SQL queries across large data sets\n\n* Experience with cloud technologies such as AWS and/or Google Cloud Platform\n\n* Experience with the Databricks platform\n\n* Experience with IaC technologies like Terraform\n\n* Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres\n\n* Experience building event streaming pipelines using Kafka/Confluent Kafka\n\n* Experience with a modern data stack such as Airflow/Astronomer, Databricks, dbt, Fivetran, Confluent, Tableau/Looker\n\n* Experience with containers and container orchestration tools such as Docker or Kubernetes\n\n* Experience with Machine Learning & MLOps\n\n* Experience with CI/CD (Jenkins, GitHub Actions, Circle CI)\n\n* Thorough understanding of SDLC and Agile frameworks\n\n* Project management skills and a demonstrated ability to work autonomously\n\n\n\nNice to Have:\n\n\n* Experience building data models using dbt\n\n* Experience with JavaScript and event tracking tools like GTM\n\n* Experience designing and developing systems with desired SLAs and data quality metrics\n\n* Experience with microservice architecture\n\n* Experience architecting an enterprise-grade data platform\n\n\n \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar jobs related to Python, Docker, Testing, DevOps, JavaScript, Cloud, API, Senior, Legal and Engineer roles:\n\n
$60,000 — $110,000/year\n
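A minimal, illustrative example of the data-quality testing the posting above mentions: a validation gate that splits incoming rows into accepted and rejected sets before anything is loaded downstream. The column names are hypothetical, not from any actual Hims schema.

```python
# Hypothetical sketch: a data-quality gate a pipeline step might run
# before loading rows into a warehouse. Column names are made up.

def validate_rows(rows, required=("user_id", "event_ts")):
    """Split rows into (valid, rejected) based on required non-empty fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(col) not in (None, "") for col in required):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"user_id": 1, "event_ts": "2024-01-01"},
    {"user_id": None, "event_ts": "2024-01-02"},  # fails the gate
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # 1 1
```

In a real pipeline the rejected rows would typically be routed to a quarantine table and surfaced through monitoring rather than silently dropped.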
\n\n#Location\nSan Francisco, California, United States
\nWe are seeking Data Engineers with a passion for sports to develop cloud-based data pipelines and automated data processing for our world-class sports intelligence platforms in baseball, basketball, cricket, eSports, football (American), golf, hockey, soccer, and tennis. Through your work, you can support the professional teams in our exclusive partner network in their efforts to compete and win championships. \n\nZelus Analytics is a fully remote company working directly with teams across the NBA, MLB, NFL, IPL and NHL, in addition to a number of soccer teams around the globe. Zelus unites a fast-growing startup environment with a research-focused culture that embraces our core values of integrity, innovation, and inclusion. We pride ourselves on providing meaningful mentorship that offers our team the opportunity to develop and expand their skill sets while also engaging with the broader analytics community. In so doing, we hope to create a new path for a broader group of highly talented people to push the cutting edge of sports analytics.\n\nWe believe that a diverse team is vital to building the world's best sports intelligence platform. Thus, we strongly encourage you to apply if you identify with any marginalized community across race, ethnicity, gender, sexual orientation, veteran status, or disability. 
At Zelus, we are committed to creating an inclusive environment where all of our employees are enabled and empowered to succeed and thrive.\n\nAs Zelus employees advance in experience and level, they are expected to build on their competencies and expertise and demonstrate increasing impact, independence, and leadership within their roles.\n\nMore specifically, as a Zelus Data Engineer, you will be expected to:\n\n\n* Design, develop, document, and maintain the schemas and ETL pipelines for our internal sports databases and data warehouses\n\n* Implement and test collection, mapping, and storage procedures for secure access to team, league, and third-party data sources\n\n* Develop algorithms for quality assurance and imputation to prepare data for exploratory analysis and quantitative modeling\n\n* Profile and optimize automated data processing tasks\n\n* Coordinate with data providers around planned changes to raw data feeds\n\n* Deploy and maintain system and database monitoring tools\n\n* Collaborate and communicate effectively in a distributed work environment\n\n* Fulfill other related duties and responsibilities, including rotating platform support\n\n\n\n\nAdditionally, a Data Engineer II will be expected to:\n\n\n* Create data ingestion and integration workflows that scale and can be easily adapted to future use cases\n\n* Assess, provision, monitor, and maintain the appropriate infrastructure and tooling to execute data engineering workflows\n\n\n\n\nAdditionally, a Senior Data Engineer will be expected to:\n\n\n* Research, design, and test generalizable software architectures for data ingestion, processing, and integration and guide organizational adoption\n\n* Collaborate with data science to design and implement vendor-agnostic data models that support downstream modeling efforts\n\n* Lead team-wide implementation of data engineering standards\n\n* Effectively communicate complex technical concepts to both internal and external audiences\n\n* Provide 
guidance and technical mentorship for junior engineers \n\n* Assist with recruiting and outreach for the engineering team, including building a diverse network of future candidates\n\n\n\n\nAdditionally, a Senior Data Engineer II will be expected to:\n\n\n* Identify and implement generalizable strategies for infrastructure maintenance and data-related cost savings\n\n* Break down complex data engineering projects into actionable work plans including proposed task assignments with clear design specifications\n\n* Assist in defining data engineering standards for the organization\n\n\n\n\nA qualified Data Engineer candidate will be able to demonstrate several of the following and will be excited to learn the rest through the mentorship provided at Zelus:\n\n\n* Academic and/or industry experience in back-end software design and development\n\n* Experience with ETL architecture and development in a cloud-based environment\n\n* Fluency in SQL development and an understanding of database and data warehousing technologies\n\n* Proficiency with Python (preferred), Scala, and/or other data-oriented programming languages\n\n* Experience with automated data quality validation across large data sets\n\n* Familiarity working with Linux servers in a virtualized/distributed environment\n\n* Strong software-engineering and problem-solving skills\n\n\n\n\nA qualified Senior Data Engineer candidate will be able to demonstrate all of the above at a higher level of competency plus the following:\n\n\n* Expertise developing complex databases and data warehouses for large-scale, cloud-based analytics systems\n\n* Experience with task orchestration and workflow automation tools\n\n* Experience building and overseeing team-wide data quality initiatives\n\n* Experience adapting, retraining, and retooling in a rapidly changing technology environment\n\n* Desire and ability to successfully mentor junior engineers\n\n\n\n\nStarting salaries range from*:\n\n\n* $87,000 to $102,000 for Data 
Engineer\n\n* $102,000 to $118,000 for Data Engineer II\n\n* $118,000 to $136,000 for Senior Data Engineer\n\n* $136,000 to $160,000 for Senior Data Engineer II\n\n\n\n\n*Compensation paid in non-US currency will be in a comparable range adjusted by differences in total cost of employment.\n\nZelus has a fully distributed workforce, spanning multiple states and countries, with a formal process for establishing compensation equity across its global staff. In addition to competitive salaries, our full-time compensation packages include equity grants and comprehensive benefits, such as an annual incentive bonus plan, supplemental health, vision, and dental insurance, and flexible PTO, all of which allow us to attract and retain a world-class team.\n\nAs an equal opportunity employer, Zelus does not discriminate on the basis of race, ethnicity, color, religion, creed, gender, gender expression or identification, sexual orientation, marital status, age, national origin, disability, genetic information, military status, or any other characteristic protected by law. It is our policy to provide reasonable accommodations for applicants and employees with disabilities. Please let us know if reasonable accommodation is needed to participate in the job application or interview process.\n\nIn most jurisdictions, Zelus is an at-will employer; employment at Zelus is for an indefinite period of time and is subject to termination by the employer or the employee at any time, with or without cause or notice. \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar jobs related to Design, Python, Senior, Engineer and Linux roles:\n\n
$60,000 — $110,000/year\n
Octopus Energy Group is hiring a Remote Senior Full Stack Engineer
\nThe energy industry is undergoing the largest transformation since industrialisation at an unprecedented rate of change, and we are positioning ourselves to be at the heart of that change. \n\n\nOur aim is to be the leading global provider of solutions that enable customers to release £30bn of value per annum from distributed energy resources (DERs). We are building a Software as a Service (SaaS) subscription business with a global addressable market of £2.4 billion per annum, by digitally connecting hundreds of thousands of DERs with energy markets.\n\n\nWe have already attained a market-leading position, and KrakenFlex is a recognised thought leader and innovator in the industry. Our efforts have not gone unnoticed, and we are pleased to announce that we now have the full support and backing of Octopus Energy, an award-winning UK energy supplier who shares our passion and values.\n\n\nYour main mission, as part of our Network Intelligence group, will be to take care of and improve our grid technology platforms. We are looking for individuals who love to engage with interesting software problems, with an interest in full-stack development and the passion to build and shape the future within a collaborative, highly agile, community-based development environment. 
What you'll do
* Design, build, and maintain high-performance, reusable, and reliable code for our Grid Monitoring and Analytics platforms, including centralised tools for IoT asset management, shared libraries used across all our platforms, build and deployment scripts, and modular UI layers.
* Take an active part in improving our processes and general architecture.
* Participate in customer project delivery activities.
* Participate actively in developing a state-of-the-art software engineering culture.

What you'll need
* Master's or Bachelor's degree in Computer Science, Computer Engineering, or equivalent.
* 3+ years of successful professional experience in a similar role.
* A hands-on, analytical approach to work.
* Curiosity and eagerness to learn.
* Initiative, self-organisation, and attention to detail.
* Strong analytical skills for solving complex issues.
* A demonstrated ability to work as part of a team.
* Excellent written and oral communication skills (French and English).

Expertise with:
* Git, Python 3, Bash, Docker, Debian Linux, TypeScript, ReactJS, VueJS, CSS.
* Networking and security best practices.
* Unit, integration, and system testing.
* System design methodologies (UML, SA/RT).

Good knowledge of:
* Cloud infrastructure (GCP, AWS, or Azure).
* Databases (SQL and NoSQL).
* Golang.
* IoT technologies.
* Operational considerations: performance tuning, monitoring.

Why else you'll love it here
* Wondering what the salary for this role is? Just ask us! On a call with one of our recruiters it's something we always cover, as we genuinely want to match your experience with the correct salary. The reason we don't advertise is that we honestly have a degree of flexibility and would never want salary to be the reason someone doesn't apply to Octopus. What's more important to us is finding the right octofit!
* Octopus Energy is a unique culture: an organisation where people learn, decide, and build quicker; where people work with autonomy, alongside a wide range of amazing co-owners, on projects that break new ground. We want your hard work to be rewarded with perks you actually care about! We won Best Company to Work For in 2022, on Glassdoor we were voted one of the 50 Best Places to Work in 2022, and our Group CEO, Greg, has recorded a podcast about our culture and how we empower our people.
* Visit our perks hub - Octopus Employee Benefits

Ideally you will be based in Greater Manchester and happy to come into the office a couple of days per week! But we appreciate that things have changed and flexibility is at the top of everyone's agenda, so if you would rather be remote, please let us know.

If this sounds like you then we'd love to hear from you.

Studies have shown that some groups of people, like women, are less likely to apply to a role unless they meet 100% of the job requirements. Whoever you are, if you like one of our jobs, we encourage you to apply, as you might just be the candidate we hire. Across Octopus, we're looking for genuinely decent people who are honest and empathetic. Our people are our strongest asset, and the unique skills and perspectives people bring to the team are the driving force of our success. As an equal opportunity employer, we do not discriminate on the basis of any protected attribute. Our commitment is to provide equal opportunities, an inclusive work environment, and fairness for everyone.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Python, Senior and Engineer jobs:

$60,000–$110,000/year
#Benefits
* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
Manchester, UK
Please reference that you found the job on Remote OK — this helps us get more companies to post here. Thanks!
Wagmo is hiring a Remote Senior Analytics Engineer
What We Do

Wagmo is a new type of pet health company. We are a pet parent's partner in pet parenting, from everyday care to rainy-day emergencies. We offer wellness plans that keep our members on budget and insurance plans that keep them ready for the unexpected, either directly or through a pet parent's employer benefits. Our members should always feel confident they are giving their pets the care they deserve. No matter how a member joins us, Wagmo is a dedicated partner for the whole journey, and you can help drive that with your unique talents.

What's Important To Us

We solve hard problems all day long, but hang out with our pets while we do it. We value authenticity and efficiency, and have no time for egos. We prioritize performance over pedigree, compensate fairly, and never take ourselves too seriously.

Our values are core to who we are and how we operate. We talk about them all the time. These are not just things posted on a wall. We interview for them, hold each other accountable to them, and make sure we work with every single person we interact with in a way that's consistent with these values.

About the role

The Senior Analytics Engineer will anchor our new in-house data team. This role balances architecting and building nimble data architecture with the curiosity of a business data analyst. This person will be responsible for delivering ad hoc data requests, dashboards, and detailed analyses, and for helping Wagmo select its future data toolset. We have a diverse, lean engineering team and a culture where we take work seriously without taking ourselves too seriously.

Wagmo is growing our business-to-business product features (benefit eligibility, benefit administration system integration), backend operations (eligibility, deductions, billing), and the ways we continue to serve our customers to ensure we make pet wellness and care available to all pets and pet parents (innovative AI automation technology, streamlined internal tools).
Our Tech Stack: Python and Go (Golang), Postgres, Airbyte, dbt, BigQuery, Superset, Sigma

What You'll Do

* Design, develop, and implement new data features used by our internal teams, employers, and pet parents.
* Understand and communicate technical feasibility by evaluating situations, insights, problem definitions, requirements, designs, existing data solution development, and proposed solutions.
* Document and demonstrate data solutions by developing dbt and schema documentation, flowcharts, layouts, diagrams, charts, code comments, data dictionaries, and clear code.
* Nurture your own and your peers' technical development by studying state-of-the-art data tools, programming techniques, and computing equipment, and by participating in educational opportunities, reading professional publications, maintaining personal networks, and participating in professional organizations.
* Collaborate with our operations and product teams to identify appropriate solutions for data issues.

What You'll Need to Be Successful

* 5+ years working in analytics engineering as an individual contributor in a nimble startup environment.
* Comfortable working on the underlying data components of an application and collaborating with other full-stack engineers to develop larger components.
* Experience with our tech stack.
* An over-communicator (we're remote, so communication is paramount).
* You see obstacles as a fun challenge, not a roadblock.
* You are a source of calm, eliminating chaos for your team and making things less ambiguous so everyone knows the way forward.

Key Benefits

* Pay range: $140k+, plus equity in the company.
* Company-paid medical premiums.
* Dental, vision, voluntary life, short-term disability, and long-term disability.
* Company-paid Wagmo pet wellness and insurance plans.
* Unlimited paid time off.
* 12 weeks parental time off.
* ClassPass subsidy.
* 401k.
* Company-wide open feedback model.

We here at Wagmo strive to build a workforce composed of individuals with diverse backgrounds, abilities, minds, and identities that will help us to grow, not only as a company, but also as individuals. Wagmo is an Equal Opportunity Employer.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Python, Senior, Engineer and Backend jobs:

$60,000–$110,000/year
#Location
New York City, New York, United States
Memora Health is hiring a Remote Senior Data Engineer
Memora Health works with leading healthcare organizations to make complex care journeys simple for patients and clinicians so that care is more accessible, actionable, and always-on. Our team is rapidly growing as we expand our programs to reach more health systems and patients, and we are excited to bring on a Senior Data Engineer.

In this role, you will drive the architecture, design, and development of our data warehouse and analytics solutions, alongside APIs that allow other internal teams to interact with our data. The ideal candidate will be able to collaborate effectively with Memora's Product Management, Engineering, QA, TechOps, and business stakeholders.

This role will work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions. Ideal candidates will be driven not only by the problem we are solving but also by the innovative approach and technology that we are applying to healthcare, looking to make a significant impact on healthcare delivery.
We're looking for someone with exceptional curiosity and enthusiasm for solving hard problems.

Primary Responsibilities:

* Collaborate with the Technical Lead, fellow engineers, Product Managers, QA, and TechOps to develop, test, secure, iterate, and scale complex data infrastructure, data models, data pipelines, APIs, and application backend functionality.
* Work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions.
* Promote product development best practices, supportability, and code quality, both by leading by example and by mentoring other software engineers.
* Manage and pare back technical debt, escalating to the Technical Lead and Engineering Manager as needed.
* Establish best practices for designing, building, and maintaining data models.
* Design and develop data models and transformation layers to support reporting, analytics, and AI/ML capabilities.
* Develop and maintain solutions that enable self-serve reporting and analytics.
* Build robust, performant ETL/ELT data pipelines.
* Develop data quality monitoring solutions to raise data quality standards and metrics accuracy.

Qualifications (Required):

* 3+ years of experience shipping, maintaining, and supporting enterprise-grade software products.
* 3+ years of data warehousing / analytics engineering experience.
* 3+ years of data modeling experience.
* Disciplined in writing readable, testable, and supportable code in JavaScript, TypeScript, Node.js (Express), Python (Flask, Django, or FastAPI), or Java.
* Expertise in writing and consuming RESTful APIs.
* Experience with relational or NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis, etc.).
* Experience with data warehouses (BigQuery, Snowflake, etc.).
* Experience with analytical and reporting tools, such as Looker or Tableau.
* Inclination toward test-driven development and test automation.
* Experience with scrum methodology.
* Excels at mentoring junior engineers.
* B.S. in Computer Science or another quantitative field, or related work experience.

Qualifications (Bonus):

* Understanding of DevOps practices and technologies (Docker, Kubernetes, CI/CD, test coverage and automation, branch and release management).
* Experience with security tooling in the SDLC and Security by Design principles.
* Experience with observability and APM tooling (Sumo Logic, Splunk, Sentry, New Relic, Datadog, etc.).
* Experience with an integration framework (Mirth Connect, Mule ESB, Apache NiFi, Boomi, etc.).
* Experience with healthcare data interoperability frameworks (FHIR, HL7, CCDA, etc.).
* Experience with healthcare data sources (EHRs, claims, etc.).
* Experience working at a startup.

What You Get:

* An opportunity to work on a rapidly scaling care delivery platform, engaging thousands of patients and care team members and growing 2-3x annually.
* A highly collaborative environment and the fun challenges of scaling a high-growth startup.
* Work alongside world-class clinical, operational, and technical teams to build and scale Memora.
* Shape how leading health systems and plans think about modernizing the care delivery experience for their patients and care teams.
* Improve the way care is delivered for hundreds of thousands of patients.
* Gain deep expertise in healthcare transformation and direct customer exposure with the country's most innovative health systems and plans.
* Ownership over your success and the ability to significantly impact the growth of our company.
* Competitive salary and equity compensation, with benefits including health, dental, and vision coverage, flexible work hours, paid maternity/paternity leave, bi-annual retreats, a MacBook, and a 401(k) plan.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Python, DevOps, NoSQL, Senior, Engineer and Backend jobs:

$60,000–$110,000/year
Our crypto wallet is used by millions of people and provides a single, convenient solution for managing all your accounts and tokens across Solana, Ethereum, and Polygon. As a data engineer at Phantom you will see a direct correlation between your work, company growth, and our users' satisfaction. Beyond this, you will work with some of the brightest minds in the web3 space, and you'll have a unique opportunity to solve some of the most interesting data challenges with efficiency and integrity, at a scale few web3 companies can match.

Our stack: Snowflake / dbt / Rudderstack / real-time data streaming / Python / SQL

Responsibilities

* Architect: Design, build, and launch extremely efficient and reliable data pipelines (ETL) to move data across a number of platforms, including third-party analytics, frontend, and backend systems.
* Educate teams: Use your data and analytics experience to see what's missing, and identify and address gaps in existing systems and processes.
* Partnership: Partner with stakeholders to understand business requirements, work with cross-functional data and product teams, and build efficient and scalable data solutions.
* Data: Manage the delivery of high-impact dashboards, tools, and data visualizations.

Qualifications

* 5+ years of experience with SQL or similar languages, and development experience in at least one programming language (Python, JavaScript, etc.).
* 5+ years of experience in the data warehouse space, including custom ETL design, implementation, and maintenance.
* Experience leading data-driven projects from definition through interpretation and execution.
* Experience with data architecture, data modeling, schema design, and software development.
* Experience working with cloud analytics platforms and tools, specifically Snowflake, dbt, and Rudderstack.
* Bonus: Experience with blockchain or cryptocurrencies, real-time data streaming, and startup environments.
* This role is fully remote; however, we're only open to candidates based in US and EU time zones.

The target base salary for this role will range between $150,000 and $250,000, with the addition of equity and benefits. This is determined by a few factors, including your skillset, prior relevant experience, quality of interviews, and market factors at the point in time of the offer.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Web3, Crypto, Python, Cloud, Senior, Engineer and Backend jobs:

$60,000–$110,000/year
#Location
Worldwide