Why TrueML?

TrueML is a mission-driven financial software company that aims to create better customer experiences for distressed borrowers. Consumers today want personal, digital-first experiences that align with their lifestyles, especially when it comes to managing finances. TrueML's approach uses machine learning to engage each customer digitally and adjust strategies in real time in response to their interactions.

The TrueML team includes inspired data scientists, financial services industry experts, and customer experience fanatics. Together they build technology that serves people in a way that recognizes their unique needs and preferences as human beings, and they work to ensure that nobody gets locked out of the financial system.

About the Role:

As a Senior Data Engineer II, you will play a pivotal role in designing, building, and maintaining our data lakehouse platform. You will leverage open table formats like Apache Iceberg to create scalable, reliable data solutions with optimized query performance across a broad spectrum of analytical workloads and emerging data applications. In this role, you'll develop and operate robust data pipelines, integrating diverse source systems and implementing efficient data transformations for both batch and streaming data.

Work-Life Benefits
* Unlimited PTO
* Medical benefit contributions in accordance with local laws and type of employment agreement

What you'll do:
* Building the Data Lakehouse: Design, build, and operate robust data lakehouse solutions using open table formats like Apache Iceberg. Your focus will be on delivering a scalable, reliable lakehouse with optimized query performance for a wide range of analytical workloads and emerging data applications.
* Pipelines and Transformation: Integrate with diverse source systems and construct scalable data pipelines. Implement efficient data transformation logic for both batch and streaming data, accommodating various data formats and structures.
* Data Modeling: Analyze business requirements and profile source data to design, develop, and implement robust data models and curated data products that power reporting, analytics, and machine learning applications.
* Data Infrastructure: Develop and manage scalable AWS cloud infrastructure for the data platform, employing Infrastructure as Code (IaC) to reliably support diverse data workloads. Implement CI/CD pipelines for automated, consistent, and scalable infrastructure deployments across all environments, adhering to best practices and company standards.
* Monitoring and Maintenance: Monitor data workloads for performance and errors, and troubleshoot issues to maintain high levels of data quality, freshness, and adherence to defined SLAs.
* Collaboration: Work closely with Data Services and Data Science colleagues to drive the evolution of our data platform, focusing on solutions that empower data users and satisfy stakeholder needs throughout the organization.

A successful candidate will have:
* Bachelor's degree in Computer Science, Engineering, or a related technical field (Master's degree is a plus).
* 5+ years of hands-on engineering experience (software or data), including 3+ years in data-focused roles.
* Experience implementing data lake and data warehousing platforms.
* Strong Python and SQL skills applied to data engineering tasks.
* Proficiency with the AWS data ecosystem, including services like S3, Glue Catalog, IAM, and Secrets Manager.
* Experience with Terraform and Kubernetes.
* A track record of successfully building and operationalizing data pipelines.
* Experience working with diverse data stores, particularly relational databases.

You might also have:
* Experience with Airflow, dbt, and Snowflake.
* Experience with stream processing technology, e.g., Flink or Spark Streaming.
* Familiarity with Domain-Driven Design principles and event-driven architectures.
* Certification in relevant technologies or methodologies.

$62,000 - $77,000 a year

Compensation Disclosure: This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience, and other relevant factors.

This role is only approved to hire within the following LatAm countries: Mexico, Argentina, or the Dominican Republic.

We are a dynamic group of people who are subject matter experts with a passion for change. Our teams are crafting solutions to big problems every day. If you're looking for an opportunity to do impactful work, join TrueML and make a difference.

Our Dedication to Diversity & Inclusion

TrueML is an equal-opportunity employer. We promote, value, and thrive with a diverse and inclusive team. Different perspectives contribute to better solutions, and this makes us stronger every day. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
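
For illustration, here is a minimal sketch of the lakehouse work this role centers on: creating an Apache Iceberg table and upserting a batch into it from PySpark. It assumes a Spark session with the Iceberg runtime and SQL extensions available; the catalog name, warehouse path, and table schema are invented for the example.

```python
from pyspark.sql import SparkSession

# Assumes the Iceberg runtime JAR is on the classpath; "lake" is an
# invented catalog name backed by an example S3 warehouse path.
spark = (
    SparkSession.builder.appName("iceberg-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Create a partitioned Iceberg table (no-op if it already exists).
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.db.events (
        event_id STRING,
        account_id STRING,
        amount DECIMAL(12, 2),
        event_ts TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(event_ts))
""")

# Stage one example record, then upsert it idempotently with MERGE.
spark.sql("""
    SELECT 'e-1' AS event_id, 'a-42' AS account_id,
           CAST(125.00 AS DECIMAL(12, 2)) AS amount,
           current_timestamp() AS event_ts
""").createOrReplaceTempView("staged_events")

spark.sql("""
    MERGE INTO lake.db.events t
    USING staged_events s ON t.event_id = s.event_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```
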
About HighLevel:
HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1,200 employees across 15 countries, working remotely as well as in our headquarters in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.

Our Website - https://www.gohighlevel.com/
YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g
Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/

Our Customers:
HighLevel serves a diverse customer base, including over 60K agencies and entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.

Scale at HighLevel:
We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 microservices in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.

About the Role:
We are seeking a talented and motivated data engineer to join our team. You will be responsible for designing, developing, and maintaining our data infrastructure, and for building backend systems that support real-time data processing, large-scale event-driven architectures, and integrations with various data systems. This role involves collaborating with cross-functional teams to ensure data reliability, scalability, and performance. You will work closely with data scientists, analysts, and software engineers to ensure efficient data flow and storage, enabling data-driven decision-making across the organisation.

Responsibilities:
* Software Engineering Excellence: Write clean, efficient, and maintainable code in JavaScript or Python while adhering to best practices and design patterns
* Design, Build, and Maintain Systems: Develop robust software solutions and implement RESTful APIs that handle high volumes of data in real time, leveraging message queues (Google Cloud Pub/Sub, Kafka, RabbitMQ) and event-driven architectures
* Data Pipeline Development: Design, develop, and maintain data pipelines (ETL/ELT) to process structured and unstructured data from various sources
* Data Storage & Warehousing: Build and optimize databases, data lakes, and data warehouses (e.g., Snowflake) for high-performance querying
* Data Integration: Work with APIs and batch and streaming data sources to ingest and transform data
* Performance Optimization: Optimize queries, indexing, and partitioning for efficient data retrieval
* Collaboration: Work with data analysts, data scientists, software developers, and product teams to understand requirements and deliver scalable solutions
* Monitoring & Debugging: Set up logging, monitoring, and alerting to ensure data pipelines run reliably
* Ownership & Problem-Solving: Proactively identify issues and bottlenecks, and propose innovative solutions to address them

Requirements:
* 3+ years of experience in software development
* Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field
* Strong Problem-Solving Skills: Ability to debug and optimize data processing workflows
* Programming Fundamentals: Solid understanding of data structures, algorithms, and software design patterns
* Software Engineering Experience: Demonstrated experience (SDE II/III level) designing, developing, and delivering software solutions using modern languages and frameworks (Node.js, JavaScript, Python, TypeScript, SQL, Scala, or Java)
* ETL Tools & Frameworks: Experience with Airflow, dbt, Apache Spark, Kafka, Flink, or similar technologies
* Cloud Platforms: Hands-on experience with GCP (Pub/Sub, Dataflow, Cloud Storage) or AWS (S3, Glue, Redshift)
* Databases & Warehousing: Strong experience with PostgreSQL, MySQL, Snowflake, and NoSQL databases (MongoDB, Firestore, Elasticsearch)
* Version Control & CI/CD: Familiarity with Git, Jenkins, Docker, Kubernetes, and CI/CD pipelines for deployment
* Communication: Excellent verbal and written communication skills, with the ability to work effectively in a collaborative environment
* Experience with data visualization tools (e.g., Superset, Tableau), Terraform, IaC, ML/AI data pipelines, and DevOps practices is a plus

EEO Statement:
The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary, and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

#LI-Remote #LI-NJ1
#Location
Delhi
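
For illustration, a minimal sketch of the event-driven side of this role: a streaming-pull subscriber for Google Cloud Pub/Sub, one of the message queues the posting names. The project and subscription IDs are placeholders, and real event processing would replace the print.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Placeholders: swap in a real project and subscription ID.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Handle one event, then ack so Pub/Sub does not redeliver it.
    print(f"Received {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        # Block and process messages; stop cleanly after 30 seconds.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```
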
About Sayari:
Sayari is the counterparty and supply chain risk intelligence provider trusted by government agencies, multinational corporations, and financial institutions. Its intuitive network analysis platform surfaces hidden risk through integrated corporate ownership, supply chain, trade transaction, and risk intelligence data from over 250 jurisdictions. Sayari is headquartered in Washington, D.C., and its solutions are used by thousands of frontline analysts in over 35 countries.

Our company culture is defined by a dedication to our mission of using open data to enhance visibility into global commercial and financial networks, a passion for finding novel approaches to complex problems, and an understanding that diverse perspectives create optimal outcomes. We embrace cross-team collaboration, encourage training and learning opportunities, and reward initiative and innovation. If you like working with supportive, high-performing, and curious teams, Sayari is the place for you.

Job Description:
Sayari's flagship product, Sayari Graph, provides instant access to structured business information from billions of corporate, legal, and trade records. As a member of Sayari's data team, you will work with the Product and Software Engineering teams to collect data from around the globe, maintain existing data pipelines, and develop new pipelines that power Sayari Graph.

Job Responsibilities:
* Write and deploy crawling scripts to collect source data from the web
* Write and run data transformers in Scala Spark to standardize bulk data sets
* Write and run modules in Python to parse entity references and relationships from source data
* Diagnose and fix bugs reported by internal and external users
* Analyze and report on internal datasets to answer questions and inform feature work
* Work collaboratively on and across a team of engineers using agile principles
* Give and receive feedback through code reviews

Skills & Experience:
* Professional experience with Python and a JVM language (e.g., Scala)
* 2+ years of experience designing and maintaining data pipelines
* Experience using Apache Spark and Apache Airflow
* Experience with SQL and NoSQL databases (e.g., column stores, graph databases)
* Experience working on a cloud platform like GCP, AWS, or Azure
* Experience working collaboratively with Git
* Understanding of Docker/Kubernetes
* Interest in learning from and mentoring team members
* Experience supporting and working with cross-functional teams in a dynamic environment
* Passion for open-source development and innovative technology
* Experience working with BI tools like BigQuery and Superset is a plus
* Understanding of knowledge graphs is a plus

$100,000 - $125,000 a year

The target base salary for this position is $100,000 - $125,000 USD plus bonus. Final offer amounts are determined by multiple factors, including location, local market variances, candidate experience and expertise, and internal peer equity, and may vary from the amounts listed above.

Benefits:
* Limitless growth and learning opportunities
* A collaborative and positive culture - your team will be as smart and driven as you
* A strong commitment to diversity, equity & inclusion
* Exceedingly generous vacation leave, parental leave, floating holidays, flexible schedule, and other remarkable benefits
* Outstanding competitive compensation and commission package
* Comprehensive family-friendly health benefits, including full healthcare coverage plans, commuter benefits, and 401K matching

Sayari is an equal opportunity employer and strongly encourages diverse candidates to apply. We believe diversity and inclusion mean our team members should reflect the diversity of the United States. No employee or applicant will face discrimination or harassment based on race, color, ethnicity, religion, age, gender, gender identity or expression, sexual orientation, disability status, veteran status, genetics, or political affiliation. We strongly encourage applicants of all backgrounds to apply.
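
For illustration, a minimal sketch of the Python parsing work in the responsibilities above: turning one raw registry record into an entity plus its relationships. The record layout and field names are invented assumptions, not Sayari's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Entity:
    entity_id: str
    name: str
    jurisdiction: str

@dataclass
class Relationship:
    source_id: str
    target_id: str
    kind: str  # e.g. "officer_of"

def parse_registry_row(row: dict) -> tuple[Entity, list[Relationship]]:
    """Parse one raw registry record into an entity and its relationships."""
    company = Entity(
        entity_id=row["company_number"],
        name=row["company_name"].strip(),
        jurisdiction=row.get("jurisdiction", "unknown"),
    )
    relationships = [
        Relationship(source_id=o["id"], target_id=company.entity_id, kind="officer_of")
        for o in row.get("officers", [])
    ]
    return company, relationships

# Example record in the invented layout above.
company, rels = parse_registry_row({
    "company_number": "C-123",
    "company_name": " Acme Trading Ltd ",
    "jurisdiction": "gb",
    "officers": [{"id": "P-9"}],
})
print(company, rels)
```
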
Hinge is the dating app designed to be deleted

In today's digital world, finding genuine relationships is tougher than ever. At Hinge, we're on a mission to inspire intimate connection to create a less lonely world. We're obsessed with understanding our users' behaviors to help them find love, and our success is defined by one simple metric: setting up great dates. With tens of millions of users across the globe, we've become the most trusted way to find a relationship, for all.

Collaborate with a cross-disciplinary team to build features and work with other engineers to plan out projects. Design and build backend systems with an emphasis on quality and scalability. Own complex projects end-to-end and effectively communicate to stakeholders. Work in a cloud-native tech stack: Kubernetes, AWS, Go web services, Postgres, Redis, Kafka. Be a thought partner for backend team strategy and technical direction. Create and maintain feedback cycles with your peers and manager. Operate and maintain production systems. Support and mentor junior developers. Assist with team hiring and learning. Use strong communication skills (written and verbal) to provide product and project ideas that contribute to trust and safety goals. Telecommuting may be permitted. When not telecommuting, must report to 809 Washington St, New York, NY 10014. Salary: $169,229 - $220,000 per year.

Minimum Requirements: Bachelor's degree or U.S. equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field, plus 5 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures. In lieu of a Bachelor's degree plus 5 years of experience, the employer will accept a Master's degree or U.S. equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field plus 3 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures.

Must also have the following: 3 years of professional experience building backend infrastructures for consumer-facing features (business to consumer) built on iOS and Android; 3 years of professional experience handling large volumes (millions daily) of data within AWS using Python and Golang scripting languages and handling cloud-based containers, including Docker and Kubernetes; 3 years of professional experience handling data and event streaming using Apache Spark, and handling data storage using relational databases (including MySQL and PostgreSQL) and NoSQL databases (including Redis); 3 years of professional experience performing and employing software engineering best practices for the full software development life cycle (including coding standards, code reviews, source control management, build processes, testing, and operations); 2 years of professional experience performing backend software engineering (including leading and collaborating with Internal Tooling, Bad Actor Detection, Privacy & Compliance, and Safety Product teams across web and apps) and developing backend infrastructures to drive systems that support the trust and safety of users with microservices written in Golang; 2 years of professional experience leading and creating project roadmaps of deployments for B2C web applications and mobile apps (including iOS and Android) and breaking down steps to delegate to peers; and 2 years of professional experience reviewing peer code and mentoring junior engineers.

Please send resume to: [email protected]. Please specify ad code [WLLL].

$169,229 - $220,000 a year

Factors such as the scope and responsibilities of the position, the candidate's work experience, education/training, job-related skills, internal peer equity, and market and business considerations may influence the base pay offered. This salary range is reflective of a position based in New York, New York.

#LI-DNI

As a member of our team, you'll enjoy:

401(k) Matching: We match 100% of the first 10% of pre-tax 401(k) contributions you make, up to a maximum of $10,000 per year.

Professional Growth: Get a $3,000 annual Learning & Development stipend once you've been with us for three months. You also get free access to Udemy, an online learning and teaching marketplace with over 6,000 courses, starting your first day.

Parental Leave & Planning: When you become a new parent, you're eligible for 100% paid parental leave (20 paid weeks for both birth and non-birth parents).

Fertility Support: You'll get easy access to fertility care through Carrot, from basic treatments to fertility preservation. We also provide $10,000 toward fertility preservation. You and your spouse/domestic partner are both eligible.

Date Stipend: All Hinge employees receive a $100 monthly stipend for epic dates, romantic or otherwise. Hinge Premium is also free for employees and their loved ones.

ERGs: We have eight Employee Resource Groups (ERGs): Asian, Unapologetic, Disability, LGBTQIA+, Vibras, Women/Nonbinary, Parents, and Remote. They hold regular meetings, host events, and provide dedicated support to the organization and its community.

At Hinge, our core values are:

Authenticity: We share, never hide, our words, actions, and intentions.

Courage: We embrace lofty goals and tough challenges.

Empathy: We deeply consider the perspective of others.

Diversity inspires innovation.

Hinge is an equal-opportunity employer. We value diversity at our company and do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We believe success is created by a diverse workforce of individuals with different ideas, strengths, interests, and cultural backgrounds.

If you require reasonable accommodation to complete a job application, pre-employment testing, or a job interview, or to otherwise participate in the hiring process, please contact [email protected].
#Location
New York, New York
About Flywheel

Flywheel's suite of digital commerce solutions accelerates growth across all major digital marketplaces for the world's leading brands. We give clients access to near real-time performance measurement and improve sales, share, and profit. With teams across the Americas, Europe, APAC, and China, we offer a career with real impact, endless growth opportunities, and the support you need to be the best you can be.

Opportunity

We're looking for a Mid/Senior Data Engineer to join our team. The best candidates will hit the ground running and contribute to our data team as we develop and maintain necessary data automation, reports, ETL/ELT, and quality controls using leading-edge cloud technologies. You will have a deep knowledge and understanding of all stages in the software development life cycle. The ability to self-start, mentor and manage less experienced data engineers, learn new technology, and manage multiple priorities, along with strong communication skills, are all in your wheelhouse!

What you'll do:
* Write high-level, well-documented code in Python and SQL
* Build data pipelines that range from simple to complex, using technologies like Apache Airflow, AWS Lambda, Step Functions, EventBridge, and other AWS serverless technologies
* Build ETL pipelines with Snowflake, AWS Glue, PySpark, and other ETL tools
* Work with a mix of structured and unstructured data across cloud-based batch and streaming architectures
* Engage directly with technical analysts, project managers, and other technical teams to help build concise requirements and ensure timely completion of projects
* Work with Git, CI/CD, and version control to maintain code and documentation
* Design and vet solutions for technical problems, and solicit team feedback during the design process
* Mentor, manage, train, and participate in paired programming in a lead capacity

Who you are:
* Must have experience with version control, GitHub, and the software development life cycle
* 4 years of experience with SQL and data modeling
* 4 years of experience developing with Python
* Demonstrated experience interacting with RESTful APIs
* Experience with data pipelines / batch automation in at least one major technology (e.g., Apache Airflow)
* Experience with one of the major cloud providers (AWS preferred)
* Experience with AWS serverless services (Lambda, EventBridge, Step Functions, SQS)
* Experience working in an agile development environment
* Streaming experience (Kafka, Kinesis, etc.)
* Familiarity with Jira
* Experience with other AWS technologies: EC2, Glue, Athena, etc.
* Experience with additional cloud platforms beyond AWS
* Experience developing CI/CD, automations, and quality-of-life improvements for developers

Working at Flywheel

We are proud to offer all Flywheelers a competitive rewards package, unparalleled career growth opportunities, and a supportive, fun, and engaging culture.

* We have office hubs across the globe where team members can go to feel productive, inspired, and connected to others
* Vacation time will depend on where you're located
* Great learning and development opportunities
* Benefits will depend on where you're located
* Volunteering opportunities
* Learn more about us here: Life at Flywheel

The Interview Process:

Every role starts the same: an introductory call with someone from our Talent Acquisition team. We will be looking for company and values fit as well as your professional experience; there may be some technical role-specific questions during this call.

Every role is different after the initial call, but you can expect to meet several people from the team 1:1, and there might be further skill assessments in the form of a take-home assignment, case study presentation, or pair programming/live coding exercise, depending on the role. In your initial call, we will walk you through exactly what to expect the process to be.

Inclusive Workforce

At Flywheel, our goal is to create a culture where individuals of all backgrounds feel comfortable bringing their authentic selves to work. We want all Flywheel people to feel included and truly empowered to contribute fully to our vision and goals.

Flywheel is an Equal Opportunity Employer and participates in E-Verify. Everyone who applies will receive fair consideration for employment. We do not discriminate based upon race, colour, religion, sex, sexual orientation, age, marital status, gender identity, national origin, disability, or any other applicable legally protected characteristics in the location in which the candidate is applying.

If you have any accessibility requirements that would make you more comfortable during the application and interview process, please let us know at [email protected] so that we can support you.

Please note: we do not accept unsolicited resumes.
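
For illustration, a minimal sketch of the AWS serverless pipeline style this posting describes: a Lambda handler that reacts to an S3 object-created event and copies the object into a staging prefix for downstream ETL. The staging prefix is an invented assumption; the event shape and boto3 calls are standard AWS APIs.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Triggered by an S3 object-created event; stages each new object."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 event notification keys are URL-encoded.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        s3.copy_object(
            Bucket=bucket,
            Key=f"staging/{key}",  # invented staging prefix
            CopySource={"Bucket": bucket, "Key": key},
        )
    return {"statusCode": 200, "body": json.dumps({"staged": len(event["Records"])})}
```
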
We are redefining how people approach their health

ZOE is combining scientific research at a scale never before imagined with cutting-edge AI to improve the health of millions.

Created by the world's top scientists, our personalised nutrition program is reimagining a fundamental human need: eating well for your own body. Currently available in the US and the UK, ZOE is already helping more than 100k ZOE members to adopt healthier habits and live better. Our work and expertise in biology, engineering, data science, and nutrition science have led to multiple breakthrough papers in leading scientific journals such as Nature Medicine, Science, The Lancet, and more.

To learn more, head to Spotify, Apple Podcasts, or Audible to listen to our Science & Nutrition Podcast (with 3 million listens!)

A remote-first, high-growth startup, we are backed by founders, investors, and entrepreneurs who have built multi-billion-dollar technology companies. We are always looking for innovative thinkers and builders to join our team on a thrilling mission to tackle epic health problems. Together, we can improve human health and touch millions of lives.

We value inclusivity, transparency, ownership, open-mindedness, and diversity. We are passionate about delivering great results and learning in the open. We want our teams to have the freedom to make long-term, high-impact decisions, and the well-being of our teammates and the people around us is a top priority.

Check out what life is like for our tech team on ZOE Tech.

We're looking for a Senior Data Engineer to take ZOE even further.

About the team

The mission of the Core Science team is to transform research trials and data into personalised, actionable recommendations that reach our members. We are currently developing a feedback loop to measure the efficacy of ZOE's nutrition and health advice, which will drive the evolution of our recommendations. In addition, the team is conducting supplementary studies in key areas, such as the microbiome, and constructing a platform to facilitate these trials alongside the main product. The team also maintains close collaboration with other stream-aligned teams to deliver scientific discoveries directly to the app.

We operate in a very dynamic and rewarding environment, where we work closely with all sorts of stakeholders to find the best solutions for both the business and our potential customers. Our agile, cross-functional teams use continuous delivery and regular feedback to ensure we deliver value to our customers on a daily basis. Our systems make use of Python, dbt, Apache Airflow, Kotlin, TypeScript, React, and FastAPI. We deploy and operate our software using Kubernetes, and our ML models using Vertex AI in GCP.

About the role

As a Senior Data Engineer in the Core Science team, you will be working with scientists, data scientists, and other engineers to build a platform that empowers our team to conduct scientific research trials and improve the efficacy of ZOE's nutrition and health advice. Every line of code you write will be a catalyst for groundbreaking discoveries.

In this role, you will also have the opportunity to make a significant impact on the personal and professional development of our team by providing guidance, support, and expertise. You will play a crucial role in helping individuals achieve their goals, overcome challenges, and maximise their potential.

You'll be
* Defining the data requirements for the research trials that the Core Science team will run, alongside data coming from the main product experience.
* Automating data collection from a variety of sources (e.g., labs, questionnaires, study coordination tools) and orchestrating the integration of data derived from these trials into our data warehouse.
* Coordinating with different product teams to ensure a seamless app experience for both study participants and paid customers.
* Ensuring consistency and accuracy of all study data used for research and product development.
* Conducting exploratory data analysis to understand data patterns and trends.
* Creating algorithms and ML models when necessary.
* Ensuring data security and compliance with regulatory standards.
* Ensuring data accessibility for internal and external stakeholders, with up-to-date documentation on data sources and schemas.

We think you'll be great if you...
* Have 6+ years of experience in data engineering roles, with a proven track record of working on data integration, ETL processes, and data warehousing.
* Are proficient in Python and SQL and have experience with data warehouses (e.g., BigQuery, Snowflake) and interactive computing environments like Jupyter Notebooks.
* Have knowledge of data governance principles and best practices for ensuring data quality, security, and compliance with regulatory standards.
* Are detail-oriented and data-savvy, to ensure the accuracy and reliability of the data.
* Strive to keep your code clean, your tests complete and maintained, and your releases frequent.
* Have experience with cloud platforms like Google Cloud Platform (GCP) and with platforms to schedule and monitor data workflows, like Apache Airflow.
* Have a solid understanding of best practices around CI/CD and containers, and of what a great release process looks like.
* Can collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
* Have a mindset of collaboration and innovation, and a passion for contributing to groundbreaking scientific discoveries.

Nice to have
* Experience with dbt and Apache Airflow.
* Experience with ML modelling and MLOps.
* Experience with privacy-preserving technologies such as federated learning and data synthesis.

These are the ideal skills, attributes, and experience we're looking for in this role. Don't worry if you don't tick all the boxes, especially on the skills and experience front; we're happy to upskill for the right candidate.

Life as a ZOEntist: what you can expect from us
As well as industry-benchmarked compensation and all the hardware and software you need, we offer a thoughtfully curated list of benefits. We expect this list to evolve as we continue supporting our team members' long-term personal and professional growth, and their wellbeing.

Remote-first: Work flexibly from home, our London office, or anywhere within the EU
Stock options: So you can share in our growth
Paid time off: 28 days paid leave (25 holiday days, plus 2 company-wide reset days, and 1 "life event" day)
Enhanced Parental Leave: On top of the statutory offering
Flexible private healthcare and life assurance options
Pension contribution: Pay monthly or top up, your choice
Health and wellbeing: Like our Employee Assistance Program and Cycle to Work Scheme
Social, WFH, and Growth (L&D) budgets. Plus, multiple opportunities to connect, grow, and socialise

We're all about equal opportunities
We know that a successful team is made up of diverse people, able to be their authentic selves. To continue growing our team in the best way, we believe that equal opportunities matter, so we encourage candidates from any underrepresented backgrounds to apply for this role. You can view our Equal Opportunities statement in full here.

A closer look at ZOE
Think you've heard our name somewhere before? We were the team behind the COVID Symptom Study, which has since become the ZOE Health Study (ZHS). We use the power of community science to conduct large-scale research from the comfort of contributors' own homes. Our collective work and expertise in biology, engineering, and data/nutrition science have led to multiple breakthrough papers in leading scientific journals such as Nature Medicine, Science, The Lancet, and more.

Seen ZOE in the media recently? Catch our co-founder Professor Tim Spector (one of the world's most cited scientists) and our Chief Scientist Dr Sarah Berry on this BBC Panorama, and listen to CEO Jonathan Wolf unpack the latest in science and nutrition on our ZOE podcast.

Oh, and if you're wondering why ZOE? It translates to "Life" in Greek, which we're helping ZOE members enjoy to the fullest.
#Location
UK/EU or compatible timezone (Remote)
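
For illustration, a minimal sketch of the workflow orchestration this role involves: an Apache Airflow DAG with an ingestion task followed by a transformation task. The DAG id, schedule, and task bodies are invented placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_study_data(ds: str, **_) -> None:
    # Placeholder: pull one day's worth of source data for execution date `ds`.
    print(f"Ingesting study data for {ds}")

def transform_study_data() -> None:
    # Placeholder: run warehouse transformations (e.g. trigger a dbt job).
    print("Transforming study data")

with DAG(
    dag_id="study_data_pipeline",  # invented name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_study_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_study_data)
    ingest >> transform  # transform only runs after a successful ingest
```
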
Data Science is at the core of Nielsen's business. Our team of researchers come from diverse disciplines, and they drive innovation, new product ideation, experimental design and testing, complex analysis, and delivery of data insights around the world. We support all International Media clients and are located where our clients are.

Lead Data Scientist - Remote - 101791
Data Science - Remote

The Lead Data Scientist's primary responsibility on the Audio Data Science team is to develop creative solutions that enhance the data and analysis infrastructure and pipeline which underpins survey quality for all Nielsen Audio survey products. To deliver high quality standards, the Data Scientist will work as a subject matter expert on a team of analysts to establish, maintain, and continuously improve the data tools and processes supporting the Audio Data Science team.
Tasks will include developing system enhancements, writing procedural and technological documentation, working with cross-functional teams to implement solutions into production systems, supporting survey methodology enhancement projects, and supporting client-facing data requests.

What will I do?
Maintain and continuously improve the variety of data infrastructure, analysis, production, and QA processes for the Audio Data Science team
Assist in the transition of the data science tech infrastructure away from legacy systems and methods
Work with cross-functional teams to implement and validate enhanced audience measurement methodologies
Build and refine data queries from large relational databases/data warehouses/data lakes for various analyses and/or requests
Utilize tools such as Python, Tableau, AWS, Databricks, etc. to independently develop, test, and implement high-quality custom, modular code to perform complex data analysis and visualizations and answer client queries
Maintain and update comprehensive documentation on departmental procedures, checklists, and metrics
Implement prevention and detection controls to ensure data integrity, as well as detect and address quality escapes
Work closely with internal customers and IT personnel to improve current processes and engineer new methods, frameworks, and data pipelines
Work as an integral member of the Audio Data Science team in a time-critical production environment
Key tasks include, but are not limited to, data integration, data harmonization, automation, examining large volumes of data, and identifying and implementing methodological, process, and technology improvements
Develop and maintain the underlying infrastructure to support forecasting and statistical models, machine learning solutions, and big data pipelines (from internal and external sources) used in a production environment

Is this for me?
Undergraduate or graduate degree in mathematics, statistics, engineering, computer science, economics, business, or another field that employs rigorous data analysis
Must be proficient with Python (and Spark/Scala) to develop sharable software with the appropriate technical documentation
Experience using GitLab, Git, or similar to manage code development
Experience using Apache Spark, Databricks, and Airflow
Expertise with Tableau or other data visualization software and techniques
Experience in containerization such as Docker and/or Kubernetes
Expertise in querying large datasets with SQL and in working with Oracle, Netezza, data warehouse, and data lake data structures
Experience in leveraging CI/CD pipelines
Experience using cloud computing platforms such as AWS, Azure, etc.
Strong ability to proactively gather information and work independently as well as within a multidisciplinary team
Proficiency in the MS Office suite (Excel, Access, PowerPoint, and Word) and/or Google Office apps (Sheets, Docs, Slides, Gmail)

Preferred
Knowledge of machine learning and data modeling techniques such as Time Series, Decision Trees, Random Forests, SVM, Neural Networks, Incremental Response Modeling, and Credit Scoring
Knowledge of survey sampling methodologies
Knowledge of statistical tests and procedures such as ANOVA, Chi-squared, Correlation, Regression, etc.
#LI-SF1

ABOUT NIELSEN
As the arbiter of truth, Nielsen Global Media fuels the media industry with unbiased, reliable data about what people watch and listen to. To discover what's true, we measure across all channels and platforms, from podcasts to streaming TV to social media. And when companies and advertisers are armed with the truth, they have a deeper understanding of their audiences and can accelerate growth.
Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward. Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and take action. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You'll enjoy working with smart, fun, curious colleagues who are passionate about their work. Come be part of a team that motivates you to do your best work!

Nielsen is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class.