The Role

LeafLink is seeking a Principal Data Engineer to join our remote-friendly team, headquartered in NYC, who is passionate about working with teams that solve interesting, large-scale problems rapidly. In this high-impact position you will coordinate and integrate third-party data sets with our proprietary data to produce valuable insights into business and customer needs. As a member of our engineering team, you will have a direct and lasting impact everywhere in the company. Your contribution will be immediate, with positive ripple effects across not just our business but also the business of each of our customers.

LeafLink is currently tackling a large-scale platform overhaul that will strengthen our position as a technical leader within the industry. As such, this role offers the opportunity to help lead, shape, and grow the data and machine learning architecture within our platform, and to work with new and growing technologies. It's a very exciting time to join our engineering team!

Ideal candidates for this position have a keen mind for finding the right solution to tough problems, partnering effectively with team members along the way. They are deeply passionate about organizing and managing data at scale for a variety of use cases. They are personable, efficient, flexible, and communicative, with a strong desire to drive change, grow, and mature, and a genuine love for their work.
This role comes with the opportunity to be a high performer within a fast-paced, dynamic, and quickly growing department.

What You'll Be Doing

* Audit, design, and maintain a high-performing, modular data pipeline architecture for structured and unstructured use cases across machine learning, reporting, and analytics
* Design and build, together with Cloud and DevOps, the infrastructure and operations required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, Python, and AWS cloud technologies
* Keep up to date on modern technologies and trends, and advocate for their inclusion in our products when it makes sense
* Analyze and evaluate existing solutions and decide whether to extend or refactor them, with a major focus on improving pipeline and reporting performance
* Work with the CTO and department stakeholders to plan short- and long-term goals, and define and execute a technical roadmap that continues to evolve LeafLink's data capabilities and functionality to meet the needs of our business and product vision
* Work collaboratively with multiple cross-functional agile teams to deliver end-to-end products and features enabled by our data pipeline, seeing them through from conception to delivery
* Help define, document, evolve, and evangelize high engineering standards, best practices, tenets, and data management and governance across data and analytics engineering
* Move quickly and intelligently, treating technical debt as your nemesis and eliminating risk
* Effectively communicate the complexity of your work to technical and non-technical audiences, in writing and in person
* Design, develop, and test data models in our data warehouse that enable data and analytics processes
* Help define and build our enterprise data catalog and dictionary
* Troubleshoot, diagnose, and address data quality issues quickly and effectively, and implement solutions that combat them at scale, including improved quality controls, observability, and monitoring
* Mentor and grow our backend and data engineers while creating repeatable, scalable solutions and patterns

What You'll Bring to the Team

* A minimum of 10 years of experience on a professional data or engineering team
* Advanced working SQL knowledge and experience with relational and non-relational databases, query authoring, and working familiarity with a variety of data stores
* Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
* Expertise writing Python processing jobs that ingest structured and unstructured data from a variety of sources and formats, such as REST APIs, flat files, and logs, scaling to both smaller and larger dataset ingestions
* Experience with the following software and tools:
  * Object-oriented and functional scripting in Python, with data processing libraries such as requests, pandas, and SQLAlchemy
  * Relational SQL and NoSQL databases, such as Redshift or comparable cloud-based OLAP databases such as Snowflake
  * Data pipeline and workflow management tools such as Airflow
  * AWS cloud services
  * Hands-on experience with DynamoDB, Terraform, Kubernetes, Fivetran, and dbt is a strong plus
  * Designing and implementing machine learning enablement tools and infrastructure
  * Leveraging API-based LLMs, dynamic prompt generation, and fine-tuning
* Comfort working in a fast-paced growth business with many collaborators and quickly evolving business needs
* Individual-contributor leadership for our data and analytics engineers, and specialization within our current Platform Engineering team in enterprise data architecture and best practices
* Consistency and standards in how we visualize and use our enterprise data at LeafLink, helping us define our first Data Dictionary and Catalog

LeafLink Perks & Benefits

* Flexible PTO - you're going to be working hard, so enjoy time off with no cap!
* A robust stock option plan to give our employees a direct stake in LeafLink's success
* 5 days of Volunteer Time Off (VTO) - giving back is important to us, and we want our employees to prioritize cultivating a better community
* Competitive compensation and 401k match
* Comprehensive health coverage (medical, dental, vision)
* Commuter benefits through our Flexible Spending Account

LeafLink's employee-centric culture has earned us a coveted spot on BuiltInNYC's Best Places to Work list for 2021. Learn more about LeafLink's history and the path to our first billion in wholesale cannabis orders here.

#Salary and compensation
No salary data was published by the company, so we estimated a salary range based on similar Python, DevOps, Cloud, NoSQL, and Engineer jobs:
$60,000 — $100,000/year
#Benefits
* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
New York City, New York, United States
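The ingestion work this posting describes (pulling structured records from flat files or REST APIs with Python and loading them into a warehouse, with quality controls along the way) can be sketched in miniature. This is a minimal, stdlib-only illustration, not LeafLink's actual pipeline: the order records, table schema, and dedup rule are all hypothetical, and SQLite stands in for a warehouse like Redshift. A real pipeline would use the tools named above (pandas, SQLAlchemy, Airflow).

```python
import json
import sqlite3

# Hypothetical raw payload, standing in for a flat file or REST API response.
RAW = json.loads("""[
  {"order_id": 1, "buyer": "Dispensary A", "total": "120.50"},
  {"order_id": 2, "buyer": "Dispensary B", "total": "80.00"},
  {"order_id": 2, "buyer": "Dispensary B", "total": "80.00"}
]""")

def transform(records):
    """Coerce string totals to floats and drop duplicate order_ids
    (a simple example of a quality control applied during transform)."""
    seen, rows = set(), []
    for r in records:
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        rows.append((r["order_id"], r["buyer"], float(r["total"])))
    return rows

def load(rows, conn):
    """Load transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, buyer TEXT, total REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(RAW), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

In a production setting each step would be a separate, retryable task in an orchestrator such as Airflow, so a failed load can be rerun without re-fetching the source.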
Please reference that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay for equipment that the company promises to reimburse later, and never pay for required training. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages about "how to work online" are also scams; don't use them or pay for them. Always verify that you are actually talking to the company in the job post and not an imposter; a good idea is to check whether the domain name for the site or email is the company's main domain name. Scams in remote work are rampant, so be careful! Read more to avoid scams. When clicking the apply button above, you will leave Remote OK and go to the company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.
Remote Lead Software Development Engineer Test Investigator
By making evidence the heart of security, we help customers stay ahead of ever-changing cyber attacks.

Corelight is a cybersecurity company that transforms network and cloud activity into evidence: evidence that elite defenders use to proactively hunt for threats, accelerate response to cyber incidents, gain complete network visibility, and create powerful analytics using machine learning and behavioral analysis tools. Easily deployed, and available in traditional and SaaS-based formats, Corelight is the fastest-growing Network Detection and Response (NDR) platform in the industry. We are the only NDR platform that leverages the power of open-source projects in addition to our own technology to deliver Intrusion Detection (IDS), Network Security Monitoring (NSM), and Smart PCAP solutions. We sell to some of the most sensitive, mission-critical large enterprises and government agencies in the world.

As the Lead Software Development Engineer in Test (SDET), you will play a pivotal role in ensuring the quality and reliability of our software products by leading the development of automated testing frameworks for application and performance testing. Your expertise in Python and AWS, and your knowledge of network security and Zeek, will be instrumental in designing, building, and maintaining robust testing solutions.
Additionally, you will be responsible for diagnosing and resolving production issues efficiently to minimize downtime and ensure a seamless user experience.

Responsibilities

* Lead the design, development, and implementation of automated testing frameworks for application and performance testing
* Collaborate with cross-functional teams to define testing strategies and requirements, ensuring comprehensive test coverage
* Use your proficiency in Python to develop and maintain test scripts, ensuring the accuracy and reliability of automated tests
* Leverage AWS services and resources to optimize test environments and infrastructure for scalability, reliability, and efficiency
* Apply your knowledge of network security principles and technologies to implement effective security testing strategies
* Diagnose and troubleshoot production issues promptly to identify root causes and minimize downtime
* Provide timely fixes and patches to address production issues, collaborating with development and operations teams as needed
* Drive continuous improvement of testing processes, tools, and methodologies to enhance efficiency and effectiveness
* Stay up to date on industry best practices, emerging technologies, and trends in software testing and development
* Provide technical leadership and guidance to the testing team, mentoring junior members and fostering a culture of excellence

Minimum Qualifications

* Strong appreciation and support for our core values: low-ego results, tireless service, and applied curiosity
* Proven experience (6+ years) in software development and testing, with a focus on automation
* Proficiency in Python
* Strong knowledge of and hands-on experience with AWS services and cloud infrastructure
* Familiarity with performance testing tools and methodologies

Preferred Qualifications

* Experience using Docker, Kubernetes, and containerized microservices
* Knowledge of relational and NoSQL databases
* Experience adopting and using agile development methodologies
* Excellent communication skills: you thrive on collaborating with multiple teams and use your communication skills to influence product direction
* Bachelor's degree in Computer Science or a related field, or equivalent experience

We are proud of our culture and values: driving diversity of background and thought, low-ego results, applied curiosity, and tireless service to our customers and community. Corelight is committed to a geographically dispersed yet connected employee base, with employees working from home and office locations around the world. Fueled by an accelerating revenue stream and investments from top-tier venture capital organizations such as CrowdStrike, Accel, and Insight, we are rapidly expanding our team.

Check us out at www.corelight.com

Notice of Pay Transparency:
The compensation for this position ranges from $180,000 - $218,000/year and may vary depending on factors such as your location, skills, and experience. Depending on the nature and seniority of the role, a percentage of compensation may come in the form of a commission-based or discretionary bonus. Equity and additional benefits will also be awarded.

#Salary and compensation
No salary data was published by the company, so we estimated a salary range based on similar Python, Testing, Cloud, NoSQL, Junior, and Engineer jobs:
$60,000 — $97,500/year
#Location
San Francisco, California, United States
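The performance-testing responsibility in this posting usually boils down to measuring the system under test repeatedly and asserting on a percentile latency budget. Here is a minimal, stdlib-only sketch of that pattern; `fake_parser` and the 50 ms budget are hypothetical stand-ins, not Corelight code, and a real framework would wrap this in pytest or similar and target actual components.

```python
import time

def measure_latency(fn, runs=50):
    """Call fn repeatedly and return the p95 latency in milliseconds
    (nearest-rank method on the sorted samples)."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    return samples[min(len(samples) - 1, int(0.95 * len(samples)))]

def fake_parser():
    # Hypothetical system under test, e.g. parsing one network event.
    sum(i * i for i in range(1000))

p95_ms = measure_latency(fake_parser)
BUDGET_MS = 50.0  # made-up latency budget for one event
within_budget = p95_ms < BUDGET_MS
```

Asserting on a percentile rather than the mean keeps the test sensitive to tail latency, which is what users of a detection pipeline actually experience under load.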
Memora Health works with leading healthcare organizations to make complex care journeys simple for patients and clinicians, so that care is more accessible, actionable, and always-on. Our team is rapidly growing as we expand our programs to reach more health systems and patients, and we are excited to bring on a Senior Data Engineer.

In this role, you will drive the architecture, design, and development of our data warehouse and analytics solutions, alongside APIs that allow other internal teams to interact with our data. The ideal candidate will collaborate effectively with Memora's Product Management, Engineering, QA, TechOps, and business stakeholders.

This role will work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions. Ideal candidates will be driven not only by the problem we are solving but also by the innovative approach and technology that we are applying to healthcare, looking to make a significant impact on healthcare delivery.
We're looking for someone with exceptional curiosity and enthusiasm for solving hard problems.

Primary Responsibilities:

* Collaborate with the Technical Lead, fellow engineers, Product Managers, QA, and TechOps to develop, test, secure, iterate, and scale complex data infrastructure, data models, data pipelines, APIs, and application backend functionality
* Work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions
* Promote product development best practices, supportability, and code quality, both by leading by example and by mentoring other software engineers
* Manage and pare back technical debt, escalating to the Technical Lead and Engineering Manager as needed
* Establish best practices for designing, building, and maintaining data models
* Design and develop data models and transformation layers to support reporting, analytics, and AI/ML capabilities
* Develop and maintain solutions that enable self-serve reporting and analytics
* Build robust, performant ETL/ELT data pipelines
* Develop data quality monitoring solutions to raise data quality standards and metrics accuracy

Qualifications (Required):

* 3+ years of experience shipping, maintaining, and supporting enterprise-grade software products
* 3+ years of data warehousing / analytics engineering
* 3+ years of data modeling experience
* Disciplined in writing readable, testable, and supportable code in JavaScript, TypeScript, Node.js (Express), Python (Flask, Django, or FastAPI), or Java
* Expertise writing and consuming RESTful APIs
* Experience with relational or NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis, etc.)
* Experience with data warehouses (BigQuery, Snowflake, etc.)
* Experience with analytical and reporting tools, such as Looker or Tableau
* Inclination toward test-driven development and test automation
* Experience with scrum methodology
* Excels at mentoring junior engineers
* B.S. in Computer Science or another quantitative field, or related work experience

Qualifications (Bonus):

* Understanding of DevOps practices and technologies (Docker, Kubernetes, CI/CD, test coverage and automation, branch and release management)
* Experience with security tooling in the SDLC and Security by Design principles
* Experience with observability and APM tooling (Sumo Logic, Splunk, Sentry, New Relic, Datadog, etc.)
* Experience with an integration framework (Mirth Connect, Mule ESB, Apache NiFi, Boomi, etc.)
* Experience with healthcare data interoperability frameworks (FHIR, HL7, CCDA, etc.)
* Experience with healthcare data sources (EHRs, claims, etc.)
* Experience working at a startup

What You Get:

* An opportunity to work on a rapidly scaling care delivery platform, engaging thousands of patients and care team members and growing 2-3x annually
* A highly collaborative environment and the fun challenges of scaling a high-growth startup
* Work alongside world-class clinical, operational, and technical teams to build and scale Memora
* Shape how leading health systems and plans think about modernizing the care delivery experience for their patients and care teams
* Improve the way care is delivered for hundreds of thousands of patients
* Gain deep expertise in healthcare transformation and direct customer exposure with the country's most innovative health systems and plans
* Ownership over your success and the ability to significantly impact the growth of our company
* Competitive salary and equity compensation, with benefits including health, dental, and vision coverage, flexible work hours, paid maternity/paternity leave, bi-annual retreats, a MacBook, and a 401(k) plan

#Salary and compensation
No salary data was published by the company, so we estimated a salary range based on similar Design, Python, DevOps, NoSQL, Senior, Engineer, and Backend jobs:
$60,000 — $110,000/year
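The data quality monitoring responsibility in this posting lends itself to a small illustration: compute completeness and validity metrics over staging rows so they can be tracked and alerted on. This is a minimal sketch with hypothetical patient-visit rows and made-up field names, not Memora's actual schema or tooling.

```python
from datetime import date

# Hypothetical staging rows; field names are illustrative only.
ROWS = [
    {"visit_id": "v1", "patient_id": "p1", "visit_date": "2024-03-01"},
    {"visit_id": "v2", "patient_id": None, "visit_date": "2024-03-02"},
    {"visit_id": "v3", "patient_id": "p2", "visit_date": "not-a-date"},
]

def quality_report(rows):
    """Compute simple completeness and validity metrics for monitoring."""
    total = len(rows)
    missing_patient = sum(1 for r in rows if not r["patient_id"])
    bad_dates = 0
    for r in rows:
        try:
            date.fromisoformat(r["visit_date"])  # validity check
        except ValueError:
            bad_dates += 1
    return {
        "row_count": total,
        "patient_id_completeness": 1 - missing_patient / total,
        "visit_date_validity": 1 - bad_dates / total,
    }

report = quality_report(ROWS)
```

In practice a job like this would run after each pipeline load, persist the metrics over time, and alert when a metric drops below an agreed threshold, which is what turns one-off checks into monitoring.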