Why TrueML?

TrueML is a mission-driven financial software company that aims to create better customer experiences for distressed borrowers. Consumers today want personal, digital-first experiences that align with their lifestyles, especially when it comes to managing finances. TrueML's approach uses machine learning to engage each customer digitally and adjust strategies in real time in response to their interactions.

The TrueML team includes inspired data scientists, financial services industry experts, and customer experience fanatics building technology to serve people in a way that recognizes their unique needs and preferences as human beings, and endeavoring to ensure that nobody gets locked out of the financial system.

About the Role:

As a Senior Data Engineer II, you will play a pivotal role in designing, building, and maintaining our cutting-edge data lakehouse platform. You will leverage open table formats like Apache Iceberg to create scalable, reliable data solutions that enable optimized query performance across a broad spectrum of analytical workloads and emerging data applications. In this role, you'll develop and operate robust data pipelines, integrating diverse source systems and implementing efficient data transformations for both batch and streaming data.

Work-Life Benefits
* Unlimited PTO
* Medical benefit contributions in accordance with local laws and type of employment agreement

What you'll do:
* Building the data lakehouse: Design, build, and operate robust data lakehouse solutions using open table formats like Apache Iceberg. Your focus will be on delivering a scalable, reliable data lakehouse with optimized query performance for a wide range of analytical workloads and emerging data applications (see the illustrative sketch below).
* Pipelines and transformation: Integrate with diverse source systems and construct scalable data pipelines. Implement efficient data transformation logic for both batch and streaming data, accommodating various data formats and structures.
* Data modeling: Analyze business requirements and profile source data to design, develop, and implement robust data models and curated data products that power reporting, analytics, and machine learning applications.
* Data infrastructure: Develop and manage scalable AWS cloud infrastructure for the data platform, employing Infrastructure as Code (IaC) to reliably support diverse data workloads.
Implement CI/CD pipelines for automated, consistent, and scalable infrastructure deployments across all environments, adhering to best practices and company standards.
* Monitoring and maintenance: Monitor data workloads for performance and errors, and troubleshoot issues to maintain high levels of data quality, freshness, and adherence to defined SLAs.
* Collaboration: Collaborate closely with Data Services and Data Science colleagues to drive the evolution of our data platform, focusing on delivering solutions that empower data users and satisfy stakeholder needs throughout the organization.

A successful candidate will have:
* Bachelor's degree in Computer Science, Engineering, or a related technical field (Master's degree is a plus).
* 5+ years of hands-on engineering experience (software or data), including 3+ years in data-focused roles.
* Experience implementing data lake and data warehousing platforms.
* Strong Python and SQL skills applied to data engineering tasks.
* Proficiency with the AWS data ecosystem, including services like S3, Glue Catalog, IAM, and Secrets Manager.
* Experience with Terraform and Kubernetes.
* A track record of successfully building and operationalizing data pipelines.
* Experience working with diverse data stores, particularly relational databases.

You might also have:
* Experience with Airflow, dbt, and Snowflake.
* Experience with stream processing technology, e.g., Flink or Spark Streaming.
* Familiarity with Domain-Driven Design principles and event-driven architectures.
* Certification in relevant technologies or methodologies.

$62,000 - $77,000 a year

Compensation Disclosure: This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience, and other relevant factors.

This role is only approved to hire within the following LatAm countries: Mexico, Argentina, or Dominican Republic.

We are a dynamic group of people who are subject matter experts with a passion for change. Our teams are crafting solutions to big problems every day. If you're looking for an opportunity to do impactful work, join TrueML and make a difference.

Our Dedication to Diversity & Inclusion

TrueML is an equal-opportunity employer. We promote, value, and thrive with a diverse and inclusive team. Different perspectives contribute to better solutions, and this makes us stronger every day. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
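For illustration of the Iceberg-on-AWS pattern this listing describes, here is a minimal, hedged PySpark sketch that writes a curated Iceberg table registered in the AWS Glue catalog. The catalog name, bucket, table, and partition column are hypothetical placeholders, and the Iceberg Spark runtime and AWS bundle JARs are assumed to be on the classpath; this is not TrueML's actual implementation.

```python
# Hedged sketch: writing a curated Apache Iceberg table registered in the AWS
# Glue catalog from PySpark. All names (catalog, bucket, table, columns) are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("iceberg-lakehouse-sketch")
    # Register an Iceberg catalog named "lakehouse" backed by AWS Glue.
    .config("spark.sql.catalog.lakehouse", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lakehouse.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .config("spark.sql.catalog.lakehouse.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

# Read a hypothetical raw batch previously landed in S3.
raw_payments = spark.read.parquet("s3://example-bucket/raw/payments/")

# Create (or replace) a partitioned Iceberg table that analytics jobs can query.
(
    raw_payments
    .writeTo("lakehouse.curated.payments")
    .partitionedBy(col("event_date"))
    .createOrReplace()
)
```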
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, Python, Cloud, Senior and Engineer: $60,000 — $135,000/year
#Benefits

401(k), distributed team, async, vision insurance, dental insurance, medical insurance, unlimited vacation, paid time off, 4-day workweek, 401(k) matching, company retreats, coworking budget, learning budget, free gym membership, mental wellness budget, home office budget, pay in crypto, pseudonymous, profit sharing, equity compensation, no whiteboard interview, no monitoring system, no politics at work, we hire old (and young)
Government Employees Insurance Company (GEICO) is hiring a Remote Staff Engineer
Our Senior Staff Engineer works with our Staff and Senior Engineers to innovate and build new systems, improve and enhance existing systems, and identify new opportunities to apply your knowledge to solve critical problems. You will lead the strategy and execution of a technical roadmap that will increase the velocity of delivering products and unlock new engineering capabilities. The ideal candidate is a self-starter with deep technical expertise in their domain.

Position Responsibilities

As a Senior Staff Engineer, you will:
* Provide technical leadership to multiple areas and provide technical and thought leadership to the enterprise
* Collaborate across team members and across the tech organization to solve our toughest problems
* Develop and execute technical software development strategy for a variety of domains
* Be accountable for the quality, usability, and performance of the solutions
* Utilize programming languages like C#, Java, Python or other object-oriented languages, SQL and NoSQL databases, container orchestration services including Docker and Kubernetes, and a variety of Azure tools and services
* Be a role model and mentor, helping to coach and strengthen the technical expertise and know-how of our engineering and product community; influence and educate executives
* Consistently share best practices and improve processes within and across teams
* Analyze costs and forecasts, incorporating them into business plans
* Determine and support resource requirements, evaluate operational processes, measure outcomes to ensure desired results, and demonstrate adaptability and sponsorship of continuous learning

Qualifications
* Exemplary ability to design, perform experiments, and influence engineering direction and product roadmap
* Experience partnering with engineering teams and transferring research to production
* Extensive experience in leading and building full-stack application and service development, with a strong focus on SaaS products and platforms
* Proven expertise in designing and developing microservices using C#, gRPC, Python, Django, Kafka, and Apache Spark, with a deep understanding of both API-driven and event-driven architectures (a minimal event-driven sketch follows this listing)
* Proven experience designing and delivering highly resilient event-driven and messaging-based solutions at scale with minimal latency
* Deep hands-on experience building complex SaaS systems for large-scale, business-focused use, with strong knowledge of Docker and Kubernetes
* Fluency and specialization in at least two modern OOP languages such as C#, Java, C++, or Python, including object-oriented design
* Strong understanding of open-source databases like MySQL and PostgreSQL, and a strong foundation in NoSQL databases like Cosmos and Cassandra, as well as Apache Trino
* In-depth knowledge of CS data structures and algorithms
* Ability to excel in a fast-paced, startup-like environment
* Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
* Experience with microservices-oriented architecture and extensible REST APIs
* Experience building the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems
* Experience implementing security protocols across services and products: understanding of Active Directory, Windows Authentication, SAML, OAuth
* Fluency in DevOps concepts, cloud architecture, and the Azure DevOps operational framework
* Experience leveraging PowerShell scripting
* Experience with operational portals such as the Azure Portal
* Experience with application monitoring tools and performance assessments
* Experience with Azure networking (subscriptions, security zoning, etc.)

Experience
* 10+ years of full-stack development experience (C#/Java/Python/Go), with expertise in client-side and server-side frameworks
* 8+ years of experience with architecture and design
* 6+ years of experience with open-source frameworks
* 4+ years of experience with AWS, GCP, Azure, or another cloud service

Education
* Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience

Annual Salary: $115,000.00 - $260,000.00

The above annual salary range is a general guideline. Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, and the work location, as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position.

Benefits: As an Associate, you'll enjoy our Total Rewards Program* to help secure your financial future and preserve your health and well-being, including:
* Premier Medical, Dental and Vision Insurance with no waiting period**
* Paid Vacation, Sick and Parental Leave
* 401(k) Plan
* Tuition Reimbursement
* Paid Training and Licensures

*Benefits may be different by location. Benefit eligibility requirements vary and may include length of service.
**Coverage begins on the date of hire. Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment, with mutual respect by and for all associates and applicants.

For more than 75 years, GEICO has stood out from the rest of the insurance industry! We are one of the nation's largest and fastest-growing auto insurers thanks to our low rates, outstanding service and clever marketing. We're an industry leader employing thousands of dedicated and hard-working associates. As a wholly owned subsidiary of Berkshire Hathaway, we offer associates training and career advancement in a financially stable and rewarding workplace. Learn more about GEICO, GEICO Diversity and Inclusion, GEICO Benefits, and Opportunities for Students & Grads.
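As a hedged illustration of the event-driven, messaging-based pattern referenced in the qualifications above, here is a minimal Python sketch using the kafka-python client. The broker address, topic name, consumer group, and event fields are hypothetical; GEICO's actual services and infrastructure are not described here.

```python
# Hedged sketch of produce/consume over Kafka with kafka-python.
# Broker, topic, group, and event schema are hypothetical placeholders.
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Publish a domain event for downstream services to react to.
producer.send("policy-events", {"policy_id": "ABC123", "event": "quote_requested"})
producer.flush()

consumer = KafkaConsumer(
    "policy-events",
    bootstrap_servers="localhost:9092",
    group_id="pricing-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    event = message.value
    # Hand the event to business logic; here we just log and stop.
    print(f"pricing-service received {event['event']} for policy {event['policy_id']}")
    break
```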
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, SaaS, Python, Docker, DevOps, Education, Cloud, API, Senior and Engineer: $47,500 — $97,500/year
#Location
MD Chevy Chase (Office) - JPS
What you'll be doing:
* Design, build and maintain ETL/ELT data pipelines that ingest data from multiple sources with different technologies (MySQL, BigQuery, Cloud Storage, CSV, FHIR, JSON, etc.) (see the illustrative Airflow sketch below);
* Architect and maintain Sword's (complex) data warehouse; we own lots of different types of data;
* Work with complex data architectures, like Kappa and Lambda architectures; you'll have the opportunity to work with batch and/or streaming processes;
* Have the opportunity to work with some of the most exciting and newest technologies in the data field;
* Work closely with our Algorithms and AI teams, supporting their needs for data and pipelines.

What you need to have:
* Strong experience with the Python programming language;
* Previous experience in Software Engineering is valued;
* A very high level of SQL and data modelling expertise;
* Knowledge of other database types, like NoSQL databases, is valued;
* Experience with cloud-based architectures, such as GCP and AWS;
* Experience with orchestration frameworks like Apache Airflow;
* Proactivity and the capability of proposing improvements to the team, whether technical or process improvements.

What we would love to see:
* Knowledge of Data Governance;
* Experience with the Modern Data Stack, such as dbt and BigQuery;
* Experience with the Kafka ecosystem, e.g. Kafka, Kafka Connect, Schema Registry and others;
* Experience with IaC (Terraform), containerization (Docker, Kubernetes) and CI/CD;
* The ability to zoom out and propose changes to the data architecture.

To ensure you feel good solving a big Human problem, we offer:
* A stimulating, fast-paced environment with lots of room for creativity;
* A bright future at a promising high-tech startup company;
* Career development and growth, with a competitive salary;
* The opportunity to work with a talented team and to add real value to an innovative solution with the potential to change the future of healthcare;
* A flexible environment where you can control your hours (remotely) with unlimited vacation;
* Access to our health and well-being program (digital therapist sessions);
* Remote or hybrid work policy (Portugal only);
* To get to know more about our tech stack, check here.

* Please note that this position does not offer relocation assistance. Candidates must possess a valid EU visa and be based in Portugal.
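To give a flavor of the orchestration work described above, here is a minimal, hedged Apache Airflow (2.4+) sketch of a daily extract-and-load job. The DAG id, schedule, and the extract/load stand-ins are hypothetical; real pipelines would read from sources such as MySQL, BigQuery, or Cloud Storage and load into the warehouse.

```python
# Hedged Airflow TaskFlow sketch; dag_id and tasks are hypothetical stand-ins.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
)
def example_daily_ingest():
    @task
    def extract(ds=None):
        # Stand-in for pulling one day of rows from a source system (MySQL, an API, GCS...).
        return [{"member_id": 1, "sessions_completed": 3, "date": ds}]

    @task
    def load(rows):
        # Stand-in for loading the rows into the warehouse (e.g. BigQuery).
        print(f"loading {len(rows)} rows")

    load(extract())


example_daily_ingest()
```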
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Python, Cloud, Senior and Engineer: $55,000 — $95,000/year
#Location
Porto
CAPCO POLAND

*We are looking for a Poland-based candidate. The job is remote but may require some business trips.

Joining Capco means joining an organisation that is committed to an inclusive working environment where you're encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It's important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table, so we'd love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.

Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.

We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.

We're seeking a skilled Mid Big Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing and maintaining scalable data pipelines and solutions on on-prem, migration, or cloud projects for large-scale data processing and analytics.

THINGS YOU WILL DO
* Design, develop and maintain robust data pipelines using Scala or Python, Spark, Hadoop, and SQL for batch and streaming data processing (a minimal Spark sketch follows this listing)
* Collaborate with cross-functional teams to understand data requirements and design efficient solutions that meet business needs
* Optimize Spark jobs and data processing workflows for performance, scalability and reliability
* Ensure data quality, integrity and security throughout the data lifecycle
* Troubleshoot and resolve data pipeline issues in a timely manner to minimize downtime and impact on business operations
* Stay updated on industry best practices, emerging technologies, and trends in big data processing and analytics
* Document design specifications, deployment procedures and operational guidelines for data pipelines and systems
* Provide technical guidance and mentorship to new joiners

TECH STACK: Python or Scala, OOP, Spark, SQL, Hadoop
Nice to have: GCP, Pub/Sub, BigQuery, Kafka, Juniper, Apache NiFi, Hive, Impala, Cloudera, CI/CD

SKILLS & EXPERIENCE YOU NEED TO GET THE JOB DONE
* Min. 3-4 years of experience as a Data Engineer / Big Data Engineer
* University degree in computer science, mathematics, natural sciences, or a similar field, plus relevant working experience
* Excellent SQL skills, including advanced concepts
* Very good programming skills in Python or Scala
* Experience with Spark and Hadoop
* Experience with OOP
* Experience using agile frameworks like Scrum
* Interest in financial services and markets
* Nice to have: experience or knowledge of GCP
* Fluent English communication and presentation skills
* Sense of humor and positive attitude

WHY JOIN CAPCO?
* Employment contract and/or business-to-business, whichever you prefer
* Possibility to work remotely
* Speaking English on a daily basis, mainly in contact with foreign stakeholders and peers
* Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life insurance)
* Access to a platform with 3,000+ business courses (Udemy)
* Access to required IT equipment
* Paid referral program
* Participation in charity events, e.g. Szlachetna Paczka
* Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
* Being part of the core squad focused on the growth of the Polish business unit
* A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
* A work culture focused on innovation and creating lasting value for our clients and employees

ONLINE RECRUITMENT PROCESS STEPS*
* Screening call with the Recruiter
* Technical interview: first stage
* Client interview
* Feedback/offer

*The recruitment process may be modified
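As a hedged illustration of the kind of Spark batch pipeline the role describes, here is a minimal PySpark job. The input/output paths, columns, and aggregation are hypothetical placeholders, not a Capco or client pipeline.

```python
# Minimal PySpark batch job sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-transactions-batch").getOrCreate()

# Read a hypothetical raw transactions dataset.
transactions = spark.read.parquet("hdfs:///data/raw/transactions/")

# Aggregate settled transactions into daily totals per account.
daily_totals = (
    transactions
    .filter(F.col("status") == "SETTLED")
    .groupBy("account_id", F.to_date("settled_at").alias("settle_date"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Write the curated output partitioned by date.
(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("settle_date")
    .parquet("hdfs:///data/curated/daily_account_totals/")
)

spark.stop()
```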
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, Python, Recruiter, Cloud, Senior and Engineer: $57,500 — $92,500/year
#Location
Kraków, Lesser Poland Voivodeship, Poland
Hinge is the dating app designed to be deleted

In today's digital world, finding genuine relationships is tougher than ever. At Hinge, we're on a mission to inspire intimate connection to create a less lonely world. We're obsessed with understanding our users' behaviors to help them find love, and our success is defined by one simple metric: setting up great dates. With tens of millions of users across the globe, we've become the most trusted way to find a relationship, for all.

Collaborate with a cross-disciplinary team to build features and work with other engineers to plan out projects. Design and build backend systems with an emphasis on quality and scalability. Own complex projects end-to-end and effectively communicate to stakeholders. Work in a cloud-native tech stack: Kubernetes, AWS, Go web services, Postgres, Redis, Kafka. Be a thought partner for backend team strategy and technical direction. Create and maintain feedback cycles with your peers and manager. Operate and maintain production systems. Support and mentor junior developers. Assist with team hiring and learning. Use strong communication skills (written and verbal) to provide product and project ideas that contribute to trust and safety goals. Telecommuting may be permitted. When not telecommuting, must report to 809 Washington St, New York, NY 10014. Salary: $169,229 - $220,000 per year.

Minimum Requirements: Bachelor's degree or U.S. equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field, plus 5 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures. In lieu of a Bachelor's degree plus 5 years of experience, the employer will accept a Master's degree or U.S. equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field plus 3 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures.

Must also have the following:
* 3 years of professional experience building backend infrastructures for consumer-facing features (business-to-consumer) built on iOS and Android;
* 3 years of professional experience handling large volumes (millions daily) of data within AWS using Python and Golang scripting languages and handling cloud-based containers including Docker and Kubernetes;
* 3 years of professional experience handling data and event streaming using Apache Spark and handling data storage using relational databases including MySQL and NoSQL databases including PostgreSQL and Redis;
* 3 years of professional experience performing and employing software engineering best practices for the full software development life cycle (including coding standards, code reviews, source control management, build processes, testing, and operations);
* 2 years of professional experience performing backend software engineering (including leading and collaborating with Internal Tooling, Bad Actor Detection, Privacy & Compliance and Safety Product teams across web and apps) and developing backend infrastructures to drive systems that support the trust and safety of users with microservices written in Golang;
* 2 years of professional experience leading and creating project roadmaps of deployments for B2C web applications and mobile apps (including iOS and Android) and breaking down steps to designate to peers; and
* 2 years of professional experience reviewing peer code and mentoring junior engineers.

Please send resume to: [email protected]. Please specify ad code [WLLL].

$169,229 - $220,000 a year

Factors such as scope and responsibilities of the position, candidate's work experience, education/training, job-related skills, internal peer equity, as well as market and business considerations may influence the base pay offered. This salary range is reflective of a position based in New York, New York.

#LI-DNI

As a member of our team, you'll enjoy:

401(k) Matching: We match 100% of the first 10% of pre-tax 401(k) contributions you make, up to a maximum of $10,000 per year.

Professional Growth: Get a $3,000 annual Learning & Development stipend once you've been with us for three months. You also get free access to Udemy, an online learning and teaching marketplace with over 6,000 courses, starting your first day.

Parental Leave & Planning: When you become a new parent, you're eligible for 100% paid parental leave (20 paid weeks for both birth and non-birth parents).

Fertility Support: You'll get easy access to fertility care through Carrot, from basic treatments to fertility preservation. We also provide $10,000 toward fertility preservation. You and your spouse/domestic partner are both eligible.

Date Stipend: All Hinge employees receive a $100 monthly stipend for epic dates, romantic or otherwise. Hinge Premium is also free for employees and their loved ones.

ERGs: We have eight Employee Resource Groups (ERGs): Asian, Unapologetic, Disability, LGBTQIA+, Vibras, Women/Nonbinary, Parents, and Remote. They hold regular meetings, host events, and provide dedicated support to the organization and its community.

At Hinge, our core values are...

Authenticity: We share, never hide, our words, actions and intentions.

Courage: We embrace lofty goals and tough challenges.

Empathy: We deeply consider the perspective of others.

Diversity inspires innovation.

Hinge is an equal-opportunity employer. We value diversity at our company and do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We believe success is created by a diverse workforce of individuals with different ideas, strengths, interests, and cultural backgrounds.

If you require reasonable accommodation to complete a job application, pre-employment testing, or a job interview, or to otherwise participate in the hiring process, please contact [email protected].
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, Python, Docker, Cloud, NoSQL, Mobile, Senior, Junior, Golang, Engineer and Backend: $60,000 — $110,000/year
#Location
New York, New York
About Us

At People Data Labs, we're committed to democratizing access to high-quality B2B data and leading the emerging DaaS economy. We empower developers, engineers, and data scientists to create innovative, compliant data products at scale with our clean, easy-to-use datasets of resume, company, location, and education data, consumed through our suite of APIs.

PDL is an innovative, fast-growing, global team backed by world-class investors, including Craft Ventures, Flex Capital, and Founders Fund. We scour the world for people hungry to improve, curious about how things work, and willing to challenge the status quo to build something new and better.

Roles & Responsibilities:
* Build infrastructure for ingesting, transforming, and loading an exponentially increasing volume of data from a variety of sources using Spark, SQL, AWS, and Databricks
* Build an organic entity resolution framework capable of correctly merging hundreds of billions of individual entities into a number of clean, consumable datasets (a simplified sketch of this idea follows the listing)
* Develop CI/CD pipelines and anomaly detection systems capable of continuously improving the quality of the data we push into production
* Devise solutions to largely undefined data engineering and data science problems
* Work with stakeholders in Engineering and Product to assist with data-related technical issues and support their infrastructure needs

Technical Requirements
* 5-7+ years of industry experience, with clear examples of strategic technical problem solving and implementation
* Strong software development fundamentals
* Experience with Python and expertise with Apache Spark (Java, Scala, and/or Python-based)
* Experience with SQL
* Experience building scalable data processing systems (e.g., cleaning, transformation) from the ground up
* Experience using developer-oriented data pipeline and workflow orchestration tools (e.g., Airflow (preferred), dbt, Dagster, or similar)
* Knowledge of modern data design and storage patterns (e.g., incremental updating, partitioning and segmentation, rebuilds and backfills)
* Experience working in Databricks (including Delta Live Tables, data lakehouse patterns, etc.)
* Experience with cloud computing services (AWS (preferred), GCP, Azure, or similar)
* Experience with data warehousing (e.g., Databricks, Snowflake, Redshift, BigQuery, or similar)
* Understanding of modern data storage formats and tools (e.g., Parquet, ORC, Avro, Delta Lake)

Professional Requirements
* Must thrive in a fast-paced environment and be able to work independently
* Can work effectively remotely (able to be proactive about managing blockers, proactive about reaching out and asking questions, and participating in team activities)
* Strong written communication skills on Slack/chat and in documents
* Experienced in writing data design docs (pipeline design, dataflow, schema design)
* Can scope and break down projects, and communicate and collaborate on progress and blockers effectively with your manager, team, and stakeholders

Nice to Haves:
* Degree in a quantitative discipline such as computer science, mathematics, statistics, or engineering
* Experience working with entity data (entity resolution / record linkage)
* Experience working with data acquisition / data integration
* Expertise with Python and the Python data stack (e.g., numpy, pandas)
* Experience with streaming platforms (e.g., Kafka)
* Experience evaluating data quality and maintaining consistently high data standards across new feature releases (e.g., consistency, accuracy, validity, completeness)

Our Benefits
* Stock
* Competitive salaries
* Unlimited paid time off
* Medical, dental, & vision insurance
* Health, fitness, and office stipends
* The permanent ability to work wherever and however you want

No C2C, 1099, or Contract-to-Hire. Recruiters need not apply.

People Data Labs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in the provision of employment opportunities and benefits.
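To illustrate, in a deliberately toy form, the blocking-and-merge idea behind the entity resolution work mentioned above, here is a hedged PySpark sketch. Paths, columns, and the blocking key are hypothetical, and production record linkage at this scale uses far more sophisticated matching than exact blocking.

```python
# Toy blocking + merge sketch (not PDL's production approach); all names
# are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("entity-resolution-sketch").getOrCreate()

people = spark.read.parquet("s3://example-bucket/raw/people/")

# Block on a cheap key: lowercased last name plus first initial.
blocked = people.withColumn(
    "block_key",
    F.concat_ws(
        "|",
        F.lower(F.col("last_name")),
        F.substring(F.lower(F.col("first_name")), 1, 1),
    ),
)

# Collapse all records sharing a block key into a single resolved entity.
resolved = blocked.groupBy("block_key").agg(
    F.max("updated_at").alias("updated_at"),
    F.first("email", ignorenulls=True).alias("email"),
    F.collect_set("source").alias("sources"),
)

resolved.write.mode("overwrite").parquet("s3://example-bucket/curated/people/")
```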
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, Python, Education, Cloud, Senior and Engineer: $65,000 — $105,000/year
#Location
San Francisco, California, United States
Remote Senior Software Engineer, Maps Data Pipeline
Waabi, founded by AI pioneer and visionary Raquel Urtasun, is an AI company building the next generation of self-driving technology. With a world-class team and an innovative approach that unleashes the power of AI to "drive" safely in the real world, Waabi is bringing the promise of self-driving closer to commercialization than ever before. Waabi is backed by best-in-class investors across the technology, logistics and Canadian innovation ecosystems.

With offices in Toronto and San Francisco, Waabi is growing quickly and looking for diverse, innovative and collaborative candidates who want to impact the world in a positive way. To learn more visit: www.waabi.ai

You will...
- Contribute to Waabi's cutting-edge AV stack and data-driven simulator.
- Build and own robust, petabyte-scale ETL pipelines for ingesting and aggregating multi-sensor data to produce the maps leveraged both by Waabi World and by our autonomy software (a minimal pipeline sketch follows this listing).
- Be part of a team of multidisciplinary engineers, research scientists, and product managers using an AI-first approach to enable safe self-driving at scale.
- Interact with all areas of autonomy and simulation, many of which will be direct customers of maps.
- Have the chance to learn about, and integrate, numerous cutting-edge ML models into various stages of Waabi's data pipelines: semantic segmentation, automated map annotation, 3D surface reconstruction, etc.
- Collaborate closely with other teams, including research scientists, ML and software engineers, and systems engineers, to understand use cases and deliver features that improve our overall data ecosystem.
- Bring your expertise to provide technical leadership and mentorship to other engineers, and contribute to org-wide data architecture decision-making.
- Assist in project roadmap planning, prioritization, and delivery.

Qualifications:
- 5+ years of experience developing and maintaining high-performance production data pipelines, including a deep understanding of cloud infrastructure and cloud storage services like AWS S3, Google Cloud Storage, and Azure Blob Storage.
- Solid coding proficiency and knowledge in Python and a compiled language like C++ or Rust.
- Solid understanding of cloud job orchestration, monitoring, and instrumentation best practices.
- Open-minded and collaborative team player with the willingness to help others.
- Passionate about self-driving technologies, solving hard problems, and creating innovative solutions.

Bonus / nice to have:
- Experience writing production software in Rust.
- Experience with MapReduce frameworks (Apache Hadoop/Spark) or orchestration frameworks like Apache Beam, Apache Airflow, or Google Dataflow.
- Familiarity with robotic sensor (LiDAR, camera) data.
- Good documentation and technical writing skills.
- Experience working in an Agile/Scrum environment.

The US yearly salary range for this role is $129,000 - $238,000 USD, in addition to competitive perks & benefits. Waabi (US) Inc.'s yearly salary ranges are determined based on several factors in accordance with the Company's compensation practices. The salary base range is reflective of the minimum and maximum target for new hire salaries for the position across all US locations. Note: The Company provides additional compensation for employees in this role, including equity incentive awards and an annual performance bonus.

Perks/Benefits:
- Competitive compensation and equity awards.
- Health and wellness benefits encompassing Medical, Dental and Vision coverage (for full-time employees only).
- Unlimited vacation.
- Flexible hours and work-from-home support.
- Daily drinks, snacks and catered meals (when in office).
- Regularly scheduled team building activities and social events on-site, off-site & virtually.
- As we grow, this list continues to evolve!

Waabi is an equal opportunity employer that celebrates diversity and is committed to creating a supportive, inclusive, and accessible environment for all employees. We seek applicants of all backgrounds and identities, across race, color, ethnicity, national origin or ancestry, age, citizenship, religion, sex, sexual orientation, gender identity or expression, military or veteran status, marital status, pregnancy or parental status, caregiver status, disability, or any other characteristic protected by law. We make workplace accommodations for qualified individuals with disabilities as required by applicable law. If reasonable accommodation is needed to participate in the job application or interview process, please let our recruiting team know.
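Since Apache Beam appears in the nice-to-haves above, here is a minimal, hedged Beam pipeline that counts frames per drive from a CSV frame index in cloud storage. The bucket paths and file layout (drive id as the first CSV column) are hypothetical; this is not Waabi's pipeline.

```python
# Hedged Apache Beam sketch; paths and CSV layout are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "ReadFrameIndex" >> beam.io.ReadFromText("gs://example-bucket/frame_index/*.csv")
        | "ExtractDriveId" >> beam.Map(lambda line: line.split(",")[0])
        | "CountFramesPerDrive" >> beam.combiners.Count.PerElement()
        | "FormatCsv" >> beam.MapTuple(lambda drive_id, n: f"{drive_id},{n}")
        | "WriteCounts" >> beam.io.WriteToText("gs://example-bucket/stats/frames_per_drive")
    )
```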
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Python, Cloud, Senior and Engineer: $60,000 — $110,000/year
#Location
Toronto, CAN, San Francisco, CA & Remote - US & Canada
About Flywheel

Flywheel's suite of digital commerce solutions accelerates growth across all major digital marketplaces for the world's leading brands. We give clients access to near real-time performance measurement and improve sales, share, and profit. With teams across the Americas, Europe, APAC, and China, we offer a career with real impact, endless growth opportunities and the support you need to be the best you can be.

Opportunity

We're looking for a Mid/Senior Data Engineer to join our team. The best candidates will hit the ground running and contribute to our data team as we develop and maintain the necessary data automation, reports, ETL/ELT, and quality controls using leading-edge cloud technologies. You will have a deep knowledge and understanding of all stages in the software development life cycle. The ability to self-start, mentor and manage less experienced data engineers, a desire to learn new technology, the ability to manage multiple priorities, and strong communication are all in your wheelhouse!

What you'll do:
* Write high-level, well-documented code in Python and SQL
* Build data pipelines that range from simple to complex, using technologies like Apache Airflow, AWS Lambda, Step Functions, EventBridge, and other AWS serverless technologies (a minimal serverless sketch follows this listing)
* Build ETL pipelines with Snowflake, AWS Glue, PySpark and other ETL tools
* Work with a mix of structured and unstructured data across cloud-based batch and streaming architectures
* Engage directly with technical analysts, project managers, and other technical teams to help build concise requirements and ensure timely completion of projects
* Work with Git, CI/CD, and version control to maintain code and documentation
* Design and vet solutions for technical problems, and solicit team feedback during the design process
* Mentor, manage, train, and participate in paired programming in a lead capacity

Who you are:
* Must have experience with version control, GitHub, and the software development life cycle
* 4 years of experience with SQL and data modeling
* 4 years of experience developing with Python
* Demonstrated experience interacting with RESTful APIs
* Experience with data pipelines / batch automation in at least one major technology (e.g. Apache Airflow)
* Experience with one of the major cloud providers (AWS preferred)
* Experience with AWS serverless services (Lambda, EventBridge, Step Functions, SQS)
* Experience working in an agile development environment
* Streaming experience (Kafka, Kinesis, etc.)
* Familiarity with Jira
* Experience with other AWS technologies: EC2, Glue, Athena, etc.
* Experience with additional cloud platforms beyond AWS
* Experience developing CI/CD, automations, and quality-of-life improvements for developers

Working at Flywheel

We are proud to offer all Flywheelers a competitive rewards package, unparalleled career growth opportunities and a supportive, fun and engaging culture.

* We have office hubs across the globe where team members can go to feel productive, inspired, and connected to others
* Vacation time will depend on where you're located
* Great learning and development opportunities
* Benefits will depend on where you're located
* Volunteering opportunities
* Learn more about us here: Life at Flywheel

The Interview Process:

Every role starts the same: an introductory call with someone from our Talent Acquisition team. We will be looking for company and values fit as well as your professional experience; there may be some technical role-specific questions during this call.

Every role is different after the initial call, but you can expect to meet several people from the team 1:1, and there might be further skill assessments in the form of a take-home assignment, case study presentation, or pair programming/live coding exercise, depending on the role. In your initial call, we will walk you through exactly what to expect the process to be.

Inclusive Workforce

At Flywheel, our goal is to create a culture where individuals of all backgrounds feel comfortable in bringing their authentic selves to work. We want all Flywheel people to feel included and truly empowered to contribute fully to our vision and goals.

Flywheel is an Equal Opportunity Employer and participates in E-Verify. Everyone who applies will receive fair consideration for employment. We do not discriminate based upon race, colour, religion, sex, sexual orientation, age, marital status, gender identity, national origin, disability, or any other applicable legally protected characteristics in the location in which the candidate is applying.

If you have any accessibility requirements that would make you more comfortable during the application and interview process, please let us know at [email protected] so that we can support you.

Please note, we do not accept unsolicited resumes.
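As a hedged illustration of the serverless pattern listed above, here is a minimal AWS Lambda handler in Python that lands a small extract in S3 when triggered, for example by an EventBridge rule. The bucket, key layout, and event fields are hypothetical and not a real Flywheel pipeline.

```python
# Hedged Lambda handler sketch; bucket, key layout, and event fields are
# hypothetical placeholders.
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # EventBridge delivers the rule's payload under "detail"; "run_date" is hypothetical.
    run_date = event.get("detail", {}).get("run_date", "1970-01-01")
    # Stand-in for real extract logic (e.g. calling a marketplace API).
    records = [{"run_date": run_date, "status": "ok"}]
    s3.put_object(
        Bucket="example-raw-bucket",
        Key=f"landing/run_date={run_date}/records.json",
        Body=json.dumps(records).encode("utf-8"),
    )
    return {"written": len(records)}
```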
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Design, Python, Serverless, Cloud and Engineer: $60,000 — $110,000/year
We are redefining how people approach their health

ZOE is combining scientific research at a scale never before imagined and cutting-edge AI to improve the health of millions.

Created by the world's top scientists, our personalised nutrition program is reimagining a fundamental human need: eating well for your own body. Currently available in the US and the UK, ZOE is already helping more than 100k ZOE members to adopt healthier habits and live better. Our work and expertise in biology, engineering, data science, and nutrition science have led to multiple breakthrough papers in leading scientific journals such as Nature Medicine, Science, The Lancet, and more.

To learn more, head to Spotify, Apple Podcasts, or Audible to listen to our Science & Nutrition Podcast (with 3 million listens!).

A remote-first, high-growth startup, we are backed by founders, investors, and entrepreneurs who have built multi-billion dollar technology companies. We are always looking for innovative thinkers and builders to join our team on a thrilling mission to tackle epic health problems. Together, we can improve human health and touch millions of lives.

We value inclusivity, transparency, ownership, open-mindedness and diversity. We are passionate about delivering great results and learning in the open. We want our teams to have the freedom to make long-term, high-impact decisions, and the well-being of our teammates and the people around us is a top priority.

Check out what life is like for our tech team on ZOE Tech.

We're looking for a Senior Data Engineer to take ZOE even further.

About the team

The mission of the Core Science team is to transform research trials and data into personalised, actionable recommendations that reach our members. We are currently developing a feedback loop to measure the efficacy of ZOE's nutrition and health advice, which will drive the evolution of our recommendations. In addition, the team is conducting supplementary studies in key areas, such as the microbiome, and constructing a platform to facilitate these trials alongside the main product. The team also maintains close collaboration with other stream-aligned teams to deliver scientific discoveries directly to the app.

We operate in a very dynamic and rewarding environment, where we work closely with all sorts of stakeholders to find the best solutions for both the business and our potential customers. Our agile, cross-functional teams use continuous delivery and regular feedback to ensure we deliver value to our customers on a daily basis. Our systems make use of Python, dbt, Apache Airflow, Kotlin, TypeScript, React, and FastAPI. We deploy and operate our software using Kubernetes, and our ML models using Vertex AI in GCP.

About the role

As a Senior Data Engineer in the Core Science team, you will be working with scientists, data scientists and other engineers to build a platform that empowers our team to conduct scientific research trials and improve the efficacy of ZOE's nutrition and health advice. Every line of code you write will be a catalyst for groundbreaking discoveries.

In this role, you will also have the opportunity to make a significant impact on the personal and professional development of our team by providing guidance, support, and expertise. You will play a crucial role in helping individuals achieve their goals, overcome challenges, and maximise their potential.

You'll be
* Defining the data requirements for the research trials that the Core Science team will run, alongside data coming from the main product experience.
* Automating data collection from a variety of sources (e.g. labs, questionnaires, study coordination tools) and orchestrating the integration of data derived from these trials into our data warehouse (see the illustrative load sketch below).
* Coordinating with different product teams to ensure a seamless app experience for both study participants and paid customers.
* Ensuring consistency and accuracy of all study data used for research and product development.
* Conducting exploratory data analysis to understand data patterns and trends.
* Creating algorithms and ML models when necessary.
* Ensuring data security and compliance with regulatory standards.
* Ensuring data accessibility to internal and external stakeholders, with up-to-date documentation on data sources and schemas.

We think you'll be great if you...
* Have 6+ years of experience in data engineering roles, with a proven track record of working on data integration, ETL processes, and data warehousing.
* Are proficient in Python and SQL and have experience with data warehouses (e.g. BigQuery, Snowflake) and interactive computing environments like Jupyter Notebooks.
* Have knowledge of data governance principles and best practices for ensuring data quality, security, and compliance with regulatory standards.
* Are detail-oriented and data-savvy, to ensure the accuracy and reliability of the data.
* Are someone who strives to keep their code clean, tests complete and maintained, and their releases frequent.
* Have experience with cloud platforms like Google Cloud Platform (GCP) and with platforms to schedule and monitor data workflows, like Apache Airflow.
* Have a solid understanding of best practices around CI/CD, containers and what a great release process looks like.
* Have the ability to collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
* Have a mindset of collaboration, innovation, and a passion for contributing to groundbreaking scientific discoveries.

Nice to have
* Experience with dbt and Apache Airflow.
* Experience with ML modelling and MLOps.
* Experience with privacy-preserving technologies such as federated learning and data synthesis.

These are the ideal skills, attributes, and experience we're looking for in this role. Don't worry if you don't tick all the boxes, especially on the skills and experience front; we're happy to upskill for the right candidate.

Life as a ZOEntist: what you can expect from us
As well as industry-benchmarked compensation and all the hardware and software you need, we offer a thoughtfully curated list of benefits. We expect this list to evolve as we continue supporting our team members' long-term personal and professional growth, and their wellbeing.

* Remote-first: work flexibly from home, our London office, or anywhere within the EU
* Stock options: so you can share in our growth
* Paid time off: 28 days paid leave (25 holiday days, plus 2 company-wide reset days, and 1 "life event" day)
* Enhanced parental leave: on top of the statutory offering
* Flexible private healthcare and life assurance options
* Pension contribution: pay monthly or top up, your choice
* Health and wellbeing: like our Employee Assistance Program and Cycle to Work Scheme
* Social, WFH, and Growth (L&D) budgets, plus multiple opportunities to connect, grow, and socialise

We're all about equal opportunities
We know that a successful team is made up of diverse people, able to be their authentic selves. To continue growing our team in the best way, we believe that equal opportunities matter, so we encourage candidates from any underrepresented backgrounds to apply for this role. You can view our Equal Opportunities statement in full here.

A closer look at ZOE
Think you've heard our name somewhere before? We were the team behind the COVID Symptom Study, which has since become the ZOE Health Study (ZHS). We use the power of community science to conduct large-scale research from the comfort of contributors' own homes. Our collective work and expertise in biology, engineering, and data/nutrition science have led to multiple breakthrough papers in leading scientific journals such as Nature Medicine, Science, The Lancet, and more.

Seen ZOE in the media recently? Catch our co-founder Professor Tim Spector (one of the world's most cited scientists) and our Chief Scientist Dr Sarah Berry on this BBC Panorama, and listen to CEO Jonathan Wolf unpack the latest in science and nutrition on our ZOE podcast.

Oh, and if you're wondering why ZOE? It translates to "Life" in Greek, which we're helping ZOE members enjoy to the fullest.
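For illustration of the kind of warehouse integration described above, here is a minimal, hedged sketch of loading a questionnaire export into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and GCS path are hypothetical; this is not ZOE's actual pipeline.

```python
# Hedged BigQuery load sketch; project, dataset, table and GCS URI are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/trials/questionnaire_export.csv",
    "example-project.research_trials.questionnaire_responses",
    job_config=job_config,
)
load_job.result()  # Wait for the load to finish before reporting.

table = client.get_table("example-project.research_trials.questionnaire_responses")
print(f"Table now has {table.num_rows} rows")
```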
#Salary and compensation

No salary data published by company, so we estimated a salary based on similar jobs related to Python, Cloud, Senior and Engineer: $60,000 — $105,000/year
#Location
UK/EU or compatible timezone (Remote)
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.