Memora Health works with leading healthcare organizations to make complex care journeys simple for patients and clinicians so that care is more accessible, actionable, and always-on. Our team is growing rapidly as we expand our programs to reach more health systems and patients, and we are excited to bring on a Senior Data Engineer.

In this role, you will be responsible for driving the architecture, design, and development of our data warehouse and analytics solutions, alongside APIs that allow other internal teams to interact with our data. The ideal candidate will collaborate effectively with Memora's Product Management, Engineering, QA, TechOps, and business stakeholders.

This role will work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions. Ideal candidates will be driven not only by the problem we are solving but also by the innovative approach and technology that we are applying to healthcare, looking to make a significant impact on healthcare delivery.
We're looking for someone with exceptional curiosity and enthusiasm for solving hard problems.

Primary Responsibilities:

* Collaborate with the Technical Lead, fellow engineers, Product Managers, QA, and TechOps to develop, test, secure, iterate, and scale complex data infrastructure, data models, data pipelines, APIs, and application backend functionality.
* Work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions.
* Promote product development best practices, supportability, and code quality, both by leading by example and by mentoring other software engineers.
* Manage and pare back technical debt, escalating to the Technical Lead and Engineering Manager as needed.
* Establish best practices for designing, building, and maintaining data models.
* Design and develop data models and transformation layers to support reporting, analytics, and AI/ML capabilities.
* Develop and maintain solutions that enable self-serve reporting and analytics.
* Build robust, performant ETL/ELT data pipelines.
* Develop data quality monitoring solutions to raise data quality standards and metrics accuracy.

Qualifications (Required):

* 3+ years of experience shipping, maintaining, and supporting enterprise-grade software products
* 3+ years of data warehousing / analytics engineering experience
* 3+ years of data modeling experience
* Disciplined in writing readable, testable, and supportable code in JavaScript, TypeScript, Node.js (Express), Python (Flask, Django, or FastAPI), or Java
* Expertise writing and consuming RESTful APIs
* Experience with relational or NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis, etc.)
* Experience with data warehouses (BigQuery, Snowflake, etc.)
* Experience with analytical and reporting tools, such as Looker or Tableau
* Inclination toward test-driven development and test automation
* Experience with scrum
methodology
* Excels at mentoring junior engineers
* B.S. in Computer Science or another quantitative field, or related work experience

Qualifications (Bonus):

* Understanding of DevOps practices and technologies (Docker, Kubernetes, CI/CD, test coverage and automation, branch and release management)
* Experience with security tooling in the SDLC and Security by Design principles
* Experience with observability and APM tooling (Sumo Logic, Splunk, Sentry, New Relic, Datadog, etc.)
* Experience with an integration framework (Mirth Connect, Mule ESB, Apache NiFi, Boomi, etc.)
* Experience with healthcare data interoperability frameworks (FHIR, HL7, CCDA, etc.)
* Experience with healthcare data sources (EHRs, claims, etc.)
* Experience working at a startup

What You Get:

* An opportunity to work on a rapidly scaling care delivery platform, engaging thousands of patients and care team members and growing 2-3x annually
* A highly collaborative environment working on the fun challenges of scaling a high-growth startup
* Work alongside world-class clinical, operational, and technical teams to build and scale Memora
* Shape how leading health systems and plans think about modernizing the care delivery experience for their patients and care teams
* Improve the way care is delivered for hundreds of thousands of patients
* Gain deep expertise in healthcare transformation and direct customer exposure with the country's most innovative health systems and plans
* Ownership over your success and the ability to significantly impact the growth of our company
* Competitive salary and equity compensation with benefits including health, dental, and vision coverage, flexible work hours, paid maternity/paternity leave, bi-annual retreats, a MacBook, and a 401(k) plan

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Python, DevOps, NoSQL, Senior, Engineer, and Backend jobs:
$60,000 — $110,000/year
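The responsibilities above include building data quality monitoring for ETL/ELT pipelines. As a rough, dependency-free illustration of one such check (not Memora's actual code; the field names, thresholds, and `check_batch` function are invented for this sketch), a per-batch quality gate might look like:

```python
# Hypothetical per-batch data-quality gate: reject a batch whose required
# fields have too many nulls. Schema and threshold are illustrative only.

REQUIRED_FIELDS = ["patient_id", "event_type", "recorded_at"]  # invented schema

def check_batch(rows, max_null_rate=0.01):
    """Return quality metrics and a pass/fail flag for one batch of rows (dicts)."""
    if not rows:
        return {"row_count": 0, "passed": False, "issues": ["empty batch"]}
    issues = []
    for field in REQUIRED_FIELDS:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            issues.append(f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}")
    return {"row_count": len(rows), "passed": not issues, "issues": issues}
```

In a real warehouse this logic would typically run as a pipeline step (e.g. a dbt test or a query against BigQuery/Snowflake) rather than in application memory; the shape of the check is the same.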
#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
Please mention that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Never pay for training you have to do, either. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams; don't use them or pay for them. Always verify that you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name of the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, so be careful! Read more to avoid scams. When clicking the button above to apply, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.
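The domain check this warning recommends can be sketched in a few lines. The function names here are our own, and a real check should also watch for lookalike domains (e.g. swapped letters) and confirm via the company's official contact page:

```python
# Illustrative check: does a recruiter's email domain match the company's
# known main domain, or a subdomain of it (e.g. jobs.example.com)?

def email_domain(address):
    """Extract the lowercased domain part of an email address."""
    return address.rsplit("@", 1)[-1].lower()

def matches_company(address, company_domain):
    """True if the address belongs to company_domain or one of its subdomains."""
    domain = email_domain(address)
    return domain == company_domain or domain.endswith("." + company_domain)
```

Note that a visually similar but different domain (such as `examp1e.com` for `example.com`) fails this check, which is exactly the kind of impostor the warning describes.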
This job post is closed and the position is probably filled. Please do not apply. Work for So Energy and want to re-open this job? Use the edit link in the email when you posted the job!
Closed by robot after the apply link errored with code 404, 3 years ago.
Description

We are changing things. We're So Energy, a fast-growing 100% renewable energy supplier in the UK. We're the leading energy supplier for customer service, and we've won a host of awards too, including The Sunday Times Fast Track 100. Over the next 12 months, we are evolving into much more than just a utility provider, and you could be a part of it.

The role

We are moving from producing ad-hoc reports based on SQL queries to building a platform in GCP. It will be based on a data warehouse in Google Cloud Platform, with Power BI and Looker on top, to provide report-as-a-service functionality.

We are looking for someone who has experience building analytics platforms with tens of different data sources, someone who can leverage modern tools and programming languages to build ETL pipelines and extract intelligence from the data.

Our users (external but also internal) are extremely important to us. Understanding their experience across the various touchpoints is very important.
Being able to apply data intelligence best practices to understand, and later forecast, behaviour and trends is the main objective.

* As a Data Engineer at So Energy, you will be responsible for building the ETL data pipelines
* Strong familiarity and experience with ingestion, streaming and batch processing, data infrastructure design, and data analytics
* Experience running and supporting production enterprise data platforms
* Experience with relational and non-relational databases
* Candidates must have some GCP knowledge and hands-on experience in the data engineering space
* Build real-time and batch data processing pipelines to publish data in GCP to match the evolving needs of the product and business
* Provide technical support to internal teams within the organisation, and to external ones when required
* Code elegant, strategic solutions that are reusable and easily maintainable
* Produce clear, concise documentation where required
* Work closely with the Lead Data Engineer to apply the data strategy, ratifying it before creating any ETL pipelines
* Pro-actively implement solutions to better understand how internal and external users perform
* Work closely with internal business teams to gather data requirements, translating them into technical artefacts and mapping them to the Google platform

Requirements

* Hands-on experience using Google Dataflow, GCS, Cloud Functions, BigQuery, Dataproc, and Apache Beam (Python) in designing data transformation rules for batch and data streaming
* Experience implementing data and analytics applications in Google Cloud Platform
* Ability to help deliver GCP data development projects (including development/coding, testing, and deployment into production)
* Provision data for analytics, data science, and machine learning purposes
* Expertise in designing data solutions for BigQuery
* Expertise in logical and physical data modelling
* Solid Python programming skills, including Apache Beam (Python)
* Experience building CI/CD pipelines and using pipeline tooling
* Good communication skills, including the ability to translate technical descriptions into something that can be understood by a non-technical, business-facing team member

Benefits

* Remote working available
* Pension matching as part of auto-enrolment
* 25 days of holiday plus bank holidays, with an extra day for your birthday!
* Ongoing training and development
* Cycle to work
* Season ticket loan
* An opportunity to work in a fast-changing industry for a leading disruptor in the field who is changing the face of the energy industry
* Work in leafy Chiswick, with free breakfast, monthly drinks, and a stunning new office space

So Energy cares about helping the energy industry become a much more diverse and inclusive environment, and we work hard to lead by example. We are committed to Equal Employment Opportunity and building an inclusive environment for all.

If you are interested in finding out more, please apply. Make sure to complete all the questions to the best of your ability and attach an up-to-date version of your CV.

Good luck!

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Engineer, Cloud, Python, and Apache jobs:
$80,000 — $120,000/year
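The role above centres on ETL pipelines, built in practice with Apache Beam on Dataflow per the requirements. As a dependency-free sketch of just the extract/transform/load shape (the field names, file format, and validation rule are invented, not So Energy's), one batch stage might look like:

```python
# Invented example: ingest raw meter readings, drop bad records, load to a sink.
# In the actual stack each function would be a Beam transform in a Dataflow job.

def extract(raw_lines):
    """Parse raw 'account,kwh,timestamp' lines into record dicts (extract)."""
    for line in raw_lines:
        account, kwh, ts = line.strip().split(",")
        yield {"account": account, "kwh": float(kwh), "ts": ts}

def transform(records):
    """Reject impossible readings and add a derived field (transform)."""
    for rec in records:
        if rec["kwh"] < 0:  # negative consumption: treat as a bad reading
            continue
        rec["kwh_rounded"] = round(rec["kwh"], 1)
        yield rec

def load(records, sink):
    """Append records to a sink (in practice a BigQuery table); return count."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count
```

Because each stage is a generator, the stages compose lazily, mirroring how Beam chains `PTransform`s over a `PCollection`: `load(transform(extract(lines)), sink)`.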
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Cisco Meraki and want to re-open this job? Use the edit link in the email when you posted the job!
Closed by robot after the apply link errored with code 404, 3 years ago.
Position: Sr. or Team Lead Machine Learning Engineer, Full Time

Location: Fully remote anywhere in the USA, or an optional office location in SF

At Cisco Meraki, we know that technology can connect, empower, and drive us. Our mission is to simplify technology so our customers can focus on what's most meaningful to them: their students, patients, customers, and businesses. We're making networking easier, faster, and smarter with technology that simply works.

As a Machine Learning Engineer on the Insight team, you will collaborate with firmware and full stack engineers to design, plan, and build customer-facing analytics tools. Meraki's cloud-managed model offers a unique opportunity to draw upon data from millions of networks across our wide-ranging customer base. The goal is to use the rich telemetry data available from these networks and combine it with the power of machine learning and the cloud to build an analytics engine that can provide intuitive yet detailed insights into the performance of customer networks.

What you can expect:

* Build a system that ingests real-time streams of network performance data and identifies network performance degradation, optimizing for both low latency and few false positives
* Design models that predict network performance for customers to help them understand their network performance issues
* Work with firmware and backend engineers to design an uplink selection algorithm for SD-WAN
* Collaborate with full stack engineers to make intuitive data visualizations and integrate predictions seamlessly and powerfully into the user experience
* Build, maintain, and monitor data pipelines and infrastructure for training and deploying models

What we're ideally looking for:

* 5+ years of relevant industry experience
* Advanced training in mathematics, statistics, and modeling
* Experience programming in Python AND some other programming language like Scala, Golang, Ruby, etc.
* Experience
working with algorithms and building models for supervised and unsupervised learning
* Experience using data processing and ML libraries such as Pandas, scikit-learn, TensorFlow, Keras, etc.
* Experience working with distributed computing engines like Apache Spark and real-time data streaming services like Amazon Kinesis
* Experience implementing and monitoring data pipelines

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Machine Learning, Engineer, Executive, Amazon, Cloud, Python, Apache, and Backend jobs:
$75,000 — $120,000/year
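The first bullet above describes a streaming detector tuned for low latency and few false positives. One classic shape for that, sketched here with invented class and parameter names (not Meraki's system), is a rolling z-score over a sliding window of recent samples, with a deliberately high threshold to suppress false positives:

```python
# Toy degradation detector: flag a latency sample that deviates sharply
# from the rolling mean/std of recent history. Window and threshold are
# illustrative; production systems tune these per metric and per network.
from collections import deque
import math

class DegradationDetector:
    def __init__(self, window=60, z_threshold=4.0):
        self.samples = deque(maxlen=window)      # bounded history => O(1) memory
        self.z_threshold = z_threshold           # high threshold => few false positives

    def observe(self, latency_ms):
        """Process one sample in O(window) time; return True if it looks degraded."""
        flagged = False
        if len(self.samples) >= 10:              # need some history before judging
            mean = sum(self.samples) / len(self.samples)
            var = sum((x - mean) ** 2 for x in self.samples) / len(self.samples)
            std = math.sqrt(var) or 1e-9         # guard against a zero-variance window
            flagged = (latency_ms - mean) / std > self.z_threshold
        self.samples.append(latency_ms)
        return flagged
```

A real pipeline would maintain the mean/variance incrementally (e.g. Welford's algorithm) instead of rescanning the window, and would likely require several consecutive flagged samples before alerting.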
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Dealer Inspire and want to re-open this job? Use the edit link in the email when you posted the job!
This position can be remote, but US-based candidates only.

About Us:

Dealer Inspire (DI) is a leading disruptor in the automotive industry through our innovative culture, legendary service, and kick-ass website, technology, and marketing solutions. Our mission is to future-proof local dealerships by building the essential, mobile-first platform that makes automotive retail faster, easier, and smarter for both shoppers and dealers. Headquartered in Naperville, IL, our team of nearly 600 work friends is spread across the United States and Canada, pushing the boundaries and getting **** done every day, together.

DI offers an inclusive environment that celebrates collaboration and thinking differently to solve the challenges our clients face. Our shared success continues to lead to rapid growth and positive change, which opens up opportunities to advance your career to the next level by working with passionate, creative people across skill sets. If you want to be challenged, learn every day, and work as a team with some of the best in the industry, we want to meet you. Apply today!

Want to learn more about who we are? Check us out here!

Job Description:

Dealer Inspire is changing the way car dealerships do business through data. We are assembling a team of engineers and data scientists to help build the next-generation distributed computing platform to support data-driven analytics and predictive modeling.

We are looking for a Data Engineer to join the team and play a critical role in the design and implementation of sophisticated data pipelines and real-time analytics streams that serve as the foundation of our data science platform.
Candidates should have the following qualifications.

Required Experience

* 2-5 years of experience as a data engineer in a professional setting
* Knowledge of the ETL process and patterns of periodic and real-time data pipelines
* Experience with data types and data transfer between platforms
* Proficiency with Python and related libraries to support the ETL process
* Working knowledge of SQL
* Experience with the Linux console (bash, etc.)
* Knowledge of cloud-based AWS resources such as EC2, S3, and RDS
* Able to work closely with data scientists on the demand side
* Able to work closely with domain experts and data source owners on the supply side
* Ability to build a data pipeline monitoring system with robust, scalable dashboards and alerts for 24/7 operations

Preferred Experience

* College degree in a technical area (Computer Science, Information Technology, Mathematics, or Statistics)
* Experience with Apache Kafka, Spark, Ignite, and/or other big data tools
* Experience with JavaScript, Node.js, PHP, and other web technologies
* Working knowledge of Java or Scala
* Familiarity with tools such as Packer, Terraform, and CloudFormation

What we are looking for in a candidate:

* Experience with data engineering, Python, and SQL
* Willingness to learn new technologies and a whatever-it-takes attitude towards building the best possible data science platform
* A person who loves data and all things data related, AKA a self-described data geek
* Enthusiasm and a "get it done" attitude!

Perks:

* Health Insurance with BCBS, Delta Dental (orthodontics coverage available), EyeMed Vision
* 401k plan with company match
* Tuition reimbursement
* 13 days of paid time off, parental leave, and selected paid holidays
* Life and disability insurance
* Subsidized gym membership
* Subsidized internet access for your home
* Peer-to-Peer Bonus program
* Work from home Fridays
* Weekly in-office yoga classes
* Fully stocked kitchen and refrigerator

*Not a complete, detailed list. Benefits have terms and requirements before employees are eligible.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Engineer, Java, Cloud, PHP, Python, Marketing, Apache, and Linux jobs:
$70,000 — $120,000/year
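The required experience above includes building a pipeline monitoring system with dashboards and alerts for 24/7 operations. One common primitive behind such alerting is a freshness (SLA) check; here is a minimal sketch with invented feed names and SLAs (not Dealer Inspire's configuration):

```python
# Hypothetical freshness monitor: alert on any feed whose newest record is
# older than its staleness SLA. Feed names and SLAs are invented examples.
from datetime import datetime, timedelta

FEED_SLAS = {
    "inventory": timedelta(minutes=15),   # batch feed: looser SLA
    "web_events": timedelta(minutes=5),   # streaming feed: tighter SLA
}

def stale_feeds(last_seen, now):
    """Return the sorted names of feeds whose latest record breaches its SLA.

    last_seen maps feed name -> datetime of the newest record; a feed with
    no record at all is treated as infinitely stale.
    """
    return sorted(
        feed for feed, sla in FEED_SLAS.items()
        if now - last_seen.get(feed, datetime.min) > sla
    )
```

In practice the result would feed a dashboard and a pager/alerting integration; the check itself would run on a schedule against pipeline metadata.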
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Thrive Global and want to re-open this job? Use the edit link in the email when you posted the job!
Thrive Global is changing the way people live through our behavior change platform and apps, used to impact individual and organizational well-being and productivity. The marriage of data and analytics, our best-in-class content, and science-backed behavior change IP will help people go from knowing what to do to actually doing it, enabling millions of consumers to begin the Thrive behavior change journey. As a technical lead on Thrive's Data Science and Analytics team, you will play a significant role in building Thrive's platform and products.

Who We Are Looking For

* A versatile engineering lead who has significant experience with data at all levels of maturity, from raw telemetry through deployment and maintenance of models in operations
* Is excited about collaborating with others, engineering and non-engineering, both learning and teaching as Thrive grows
* An innovator looking to push the boundaries of automation, intelligent ETL, AIOps, and MLOps to drive high-quality insights and operational efficiency within the team
* Has a proven track record of building and shipping data-centric software products
* Desires a position that is approximately 75% individual technical contribution and 25% mentoring junior engineers or serving as a trusted advisor to engineering leadership
* Is comfortable in a high-growth, start-up environment and is willing to wear many hats and come up with creative solutions

How You'll Contribute

* Collaborate with the Head of Data Science and Analytics to design an architecture and infrastructure to support data engineering and machine learning at Thrive
* Implement a production-grade data science platform, which includes building data pipelines, automating data quality assessments, and automatically deploying models into production
* Develop new technology solutions to ensure a seamless transition of machine learning algorithms to production software, to enable the building out of easy-to-use
datasets and to reduce other friction points within the data science life cycle
* Assist with building a small but skilled interdisciplinary team of data professionals: scientists, analysts, and engineers
* Consider user privacy and security at all times

Required Skills

* Master's or Ph.D. degree in Computer Science or a related discipline (e.g., Mathematics, Physics)
* 3+ years of technical leadership, with a team size of 5 or more, in data engineering or machine learning projects
* 8+ years of industry experience with data engineering and machine learning
* Extensive programming experience in Java or Python with applications in data engineering and machine learning
* Experience with data modeling, large-scale batch and real-time data processing, and ETL design, implementation, and maintenance
* Excellent verbal and written communication skills
* Self-starter with a positive attitude, intellectual curiosity, and a passion for analytics and solving real-world problems

Relevant Technology and Tools Experience

A good cross-section of experience in the following areas is desired:

* AI/ML platforms: TensorFlow, Apache MXNet, Theano, Keras, CNTK, scikit-learn, H2O, Spark MLlib, AWS SageMaker, etc.
* Relational databases: MySQL, Postgres, Redshift, etc.
* Big data technologies: Spark, HDFS, Hive, YARN, etc.
* Data ingestion tools: Kafka, NiFi, Storm, Amazon Kinesis, etc.
* Deployment technologies: Docker, Kubernetes, or OpenStack
* Public cloud: Azure, AWS, or Google Cloud Platform

Our Mission

Thrive Global's mission is to end the stress and burnout epidemic by offering companies and individuals sustainable, science-based solutions to enhance well-being, performance, and purpose, and create a healthier relationship with technology. Recent science has shown that the pervasive belief that burnout is the price we must pay for success is a delusion.
We know, instead, that when we prioritize our well-being, our decision-making, creativity, and productivity improve dramatically. Thrive Global is committed to accelerating the culture shift that allows people to reclaim their lives and move from merely surviving to thriving.

What We Offer

* A mission-driven company that's truly making a difference in the lives of people around the world
* The ability to develop within the company and shape our growth strategy
* A human-centric culture with a range of wellness perks and benefits
* A competitive compensation package
* Medical, vision, and dental coverage + 401k program with company match
* Generous paid time-off programs

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Machine Learning, Engineer, Executive, Teaching, Amazon, Java, Cloud, Python, Junior, and Apache jobs:
$75,000 — $120,000/year
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.