Junior Solutions Engineer Position Overview

We are looking for a SingleStore Solutions Engineer who is passionate about removing data bottlenecks for customers and bringing real-time data capabilities to some of the most difficult data challenges in the industry. In this role you will work directly with our sales teams and channel partners to identify prospective and current customer pain points where SingleStore can remove those bottlenecks and deliver real-time capabilities. You will provide value-based demonstrations and presentations, and support proofs of concept to validate proposed solutions.

As a SingleStore Solutions Engineer, you must share our passion for real-time data, fast analytics, and simplified data architecture. You must be comfortable both in executive-level conversations and in understanding the technology and its value proposition in depth.

About our Team

At SingleStore, the Solutions Engineering team epitomizes a dynamic blend of innovation, expertise, and a fervent commitment to meeting complex data challenges head-on. This team is composed of highly skilled individuals who are not just adept at working with the latest technologies but are also instrumental in ensuring that SingleStore is the perfect fit for our customers.

Our team thrives on collaboration and determination, building some of the most cutting-edge deployments of SingleStore data architectures for our most strategic customers. This involves working directly with product management to ensure that our product is not only addressing current data challenges but is also geared up for future advancements.

Beyond the technical prowess, our team culture is rooted in a shared passion for transforming how businesses leverage data. We are a community of forward-thinkers, where each member's contribution is valued in our collective pursuit of excellence. Our approach combines industry-leading engineering, visionary design, and a dedicated customer success ethos to shape the future of database technology. In our team, every challenge is an opportunity for growth, and we support each other in our continuous learning journey. At SingleStore, we're more than a team; we're innovators shaping the real-time data solutions of tomorrow.

Responsibilities

* Engage with both current and prospective clients to understand their technical and business challenges
* Present and demonstrate the SingleStore product offering to Fortune 500 companies
* Stay enthusiastic about the data analytics and data engineering landscape
* Provide valuable feedback to product teams based on client interactions
* Stay up to date with database technologies and the SingleStore product offerings

Qualifications

* Excellent presentation and communication skills, with experience presenting to large corporate organizations
* Ability to communicate complex technical concepts to non-technical audiences
* Strong team player with good interpersonal skills
* Broad range of experience within large-scale database and/or data warehousing technologies
* Experience with data engineering tools such as Apache Spark, Apache Flink, and Apache Airflow
* Demonstrated proficiency in ANSI SQL
* Demonstrated proficiency in Python, Scala, or Java
* Understanding of private and public cloud platforms such as AWS, Azure, GCP, and VMware
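The following example is an editorial addition, not part of the original posting: a minimal Python sketch of the kind of hybrid transactional-plus-analytical demo a Solutions Engineer might walk a prospect through, using a MySQL-compatible driver (SingleStore speaks the MySQL wire protocol). The host, credentials, table, and column names are all hypothetical.

```python
# Hypothetical demo sketch: a transactional write followed by an analytical
# aggregate against the same live table. Connection details, the "orders"
# table, and its columns are placeholders, not taken from the job posting.
import pymysql

conn = pymysql.connect(
    host="demo-cluster.example.com",  # placeholder endpoint
    user="demo_user",
    password="demo_password",
    database="retail_demo",
)

with conn.cursor() as cur:
    # Transactional side: record a new order as it arrives.
    cur.execute(
        "INSERT INTO orders (order_id, customer_id, amount, created_at) "
        "VALUES (%s, %s, %s, NOW())",
        (1001, 42, 129.99),
    )
    conn.commit()

    # Analytical side: an aggregate over the same table, with no separate
    # analytics system in between.
    cur.execute(
        "SELECT customer_id, COUNT(*) AS orders, SUM(amount) AS revenue "
        "FROM orders "
        "WHERE created_at >= NOW() - INTERVAL 1 DAY "
        "GROUP BY customer_id ORDER BY revenue DESC LIMIT 10"
    )
    for customer_id, order_count, revenue in cur.fetchall():
        print(customer_id, order_count, revenue)

conn.close()
```

In a real proof of concept the same pattern would be wrapped in the prospect's own schema and workload; the sketch only illustrates that transactional writes and analytical reads target a single system.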
SingleStore delivers the cloud-native database with the speed and scale to power the world's data-intensive applications. With a distributed SQL database that introduces simplicity to your data architecture by unifying transactions and analytics, SingleStore empowers digital leaders to deliver exceptional, real-time data experiences to their customers. SingleStore is venture-backed and headquartered in San Francisco with offices in Sunnyvale, Raleigh, Seattle, Boston, London, Lisbon, Bangalore, Dublin and Kyiv.

Consistent with our commitment to diversity & inclusion, we value individuals with the ability to work on diverse teams and with a diverse range of people.

Please note that SingleStore's COVID-19 vaccination policy requires that team members in the United States be up to date with the current CDC guidelines for their vaccinations with one of the United States FDA-approved vaccine options to meet in person for SingleStore business or to work from one of our U.S. office locations. [It is expected that this will be a requirement for this role]. If an exemption and/or accommodation to our vaccination policy is requested, a member of the Human Resources department will be available to begin the interactive accommodation process.

To all recruitment agencies: SingleStore does not accept agency resumes. Please do not forward resumes to SingleStore employees. SingleStore is not responsible for any fees related to unsolicited resumes and will not pay fees to any third-party agency or company that does not have a signed agreement with the Company.

#li-remote #remote-li

SingleStore values individuals for their unique skills and experiences, and we're proud to offer roles in a variety of locations across the United States. Salary is based on permissible, non-discriminatory factors such as skills, experience, and geographic location, and is just one part of our total compensation and benefits package. Certain roles are also eligible for additional rewards, including merit increases and annual bonuses.

Our benefits package for this role includes: stock options, flexible paid time off, monthly three-day weekends, 14 weeks of fully paid gender-neutral parental leave, fertility and adoption assistance, mental health counseling, a 401(k) retirement plan, and rich health insurance offerings, including medical, dental, vision, and life and disability insurance.

SingleStore's base salary range for this role, if based in California, Colorado, Washington, or New York City, is: $X - $X USD per year

For candidates residing in California, please see our California Recruitment Privacy Notice. For candidates residing in the EEA, UK, and Switzerland, please see our EEA, UK, and Swiss Recruitment Privacy Notice.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Cloud, Junior, Sales, and Engineer jobs:
$37,500 – $92,500/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
#Location
Raleigh, North Carolina, United States
Verana Health, a digital health company that delivers quality drug lifecycle and medical practice insights from an exclusive real-world data network, recently secured a $150 million Series E led by Johnson & Johnson Innovation - JJDC, Inc. (JJDC) and Novo Growth, the growth-stage investment arm of Novo Holdings.

Existing Verana Health investors GV (formerly Google Ventures), Casdin Capital, and Brook Byers also joined the round, as well as notable new investors, including the Merck Global Health Innovation Fund, THVC, and Breyer Capital.

We are driven to create quality real-world data in ophthalmology, neurology, and urology to accelerate quality insights across the drug lifecycle and within medical practices. Additionally, we are driven to advance the quality of care and quality of life for patients. DRIVE defines our internal purpose and is the galvanizing force that helps ground us in a shared corporate culture. DRIVE is: Diversity, Responsibility, Integrity, Voice-of-Customer and End-Results. Click here to read more about our culture and values.

Our headquarters are located in San Francisco and we have additional offices in Knoxville, TN and New York City, with employees working remotely in AZ, CA, CO, CT, FL, GA, IL, LA, MA, NC, NJ, NY, OH, OR, PA, TN, TX, UT, VA, WA, WI. All employees are required to have permanent residency in one of these states. Candidates who are willing to relocate are also encouraged to apply.

Job Title: Data Engineer

Job Intro:

As a Data/Software Engineer at Verana Health, you will be responsible for extending a set of tools used for data pipeline development. You will bring strong hands-on experience in the design and development of cloud services, along with a deep understanding of data quality, metadata management, data ingestion, and curation. You will generate software solutions using Apache Spark, Hive, Presto, and other big data frameworks, analyzing systems and requirements to provide the best technical solutions with regard to the flexibility, scalability, and reliability of the underlying architecture. You will also document and improve software testing and release processes across the entire data team.
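As an editorial illustration (not part of the original posting) of the data-quality and observability duties listed below, here is a minimal PySpark sketch of a quality gate that emits simple operational metrics. The input path, column names, and the 1% threshold are hypothetical.

```python
# Hypothetical sketch: a data-quality gate inside a Spark-based pipeline.
# The dataset path, the patient_id/encounter_id columns, and the threshold
# are illustrative placeholders only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

df = spark.read.parquet("s3://example-bucket/curated/encounters/")  # placeholder path

total_rows = df.count()
null_patient_ids = df.filter(F.col("patient_id").isNull()).count()
duplicate_rows = total_rows - df.dropDuplicates(["encounter_id"]).count()

# Operational metrics a real pipeline might push to a metrics store or
# alerting system instead of printing.
metrics = {
    "row_count": total_rows,
    "null_patient_id_ratio": null_patient_ids / max(total_rows, 1),
    "duplicate_encounter_rows": duplicate_rows,
}
print(metrics)

# Fail the pipeline step if quality drops below the agreed threshold.
if metrics["null_patient_id_ratio"] > 0.01:
    raise ValueError("patient_id null ratio exceeded 1%; failing the run")

spark.stop()
```

A production version would persist these metrics for trend-based alerting rather than printing them, in line with the observability and alerting duties below.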
Job Duties and Responsibilities:

* Architect, implement, and maintain scalable data architectures to meet data processing and analytics requirements using AWS and Databricks
* Troubleshoot complex data issues and optimize pipelines, taking data quality, computation, and cost into consideration
* Collaborate with cross-functional teams to understand data needs and translate them into effective data pipeline solutions
* Design solutions to problems related to the ingestion and curation of highly variable data structures in a highly concurrent cloud environment
* Retain metadata that tracks execution details, supporting reproducibility and providing operational metrics
* Create routines that add observability and alerting on the health of pipelines
* Establish data quality checks and ensure data integrity and accuracy throughout the data lifecycle
* Research, perform proofs of concept, and leverage performant database technologies (such as Aurora Postgres, Elasticsearch, and Redshift) to support end-user applications that need sub-second response times
* Participate in code reviews
* Proactively stay updated with industry trends and emerging technologies in data engineering
* Develop data services as RESTful APIs that are secure (OAuth/SAML), scalable (containerized with Docker), observable (using monitoring tools such as Datadog or the ELK stack), and documented with OpenAPI/Swagger, built with Python/Java frameworks and deployed through automated CI/CD using GitHub Actions
* Document data engineering processes, architectures, and configurations

Basic Requirements:

* A minimum of a BS degree in computer science, software engineering, or a related scientific discipline
* A minimum of 3 years of experience in software development
* Strong programming skills in languages such as Python/PySpark and SQL
* Experience with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT)
* Experience with data pipeline orchestration tools such as Airflow and Databricks Workflows
* 1 year of experience working in an AWS cloud computing environment, preferably with Lambda, S3, SNS, and SQS
* Understanding of data management principles (governance, security, cataloging, lifecycle management, privacy, quality)
* Good understanding of relational databases
* Demonstrated ability to build software tools, in a collaborative and team-oriented environment, that are product- and customer-driven
* Strong communication and interpersonal skills
* Experience using source code version control
* Hands-on experience with Docker containers and container orchestration

Bonus:

* Healthcare and medical data experience is a plus
* Additional experience with modern compiled programming languages (C++, Go, Rust)
* Experience building HTTP/REST APIs using popular frameworks
* Experience building out extensive automated test suites

Benefits:

* We provide health, vision, and dental coverage for employees; Verana pays 100% of employee insurance coverage and 70% of family coverage, plus an additional monthly $100 individual / $200 HSA contribution with an HDHP
* Spring Health mental health support
* Flexible vacation plans
* A generous parental leave policy and family-building support through the Carrot app
* $500 learning and development budget
* $25/week in DoorDash credit
* Headspace meditation app - unlimited access
* Gympass - 3 free live classes per week + monthly discounts for gyms like SoulCycle
Final note:

You do not need to match every listed expectation to apply for this position. Here at Verana, we know that diverse perspectives foster the innovation we need to be successful, and we are committed to building a team that encompasses a variety of backgrounds, experiences, and skills.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Docker, Testing, Cloud, and Engineer jobs:
$70,000 – $100,000/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
Memora Health works with leading healthcare organizations to make complex care journeys simple for patients and clinicians so that care is more accessible, actionable, and always-on. Our team is rapidly growing as we expand our programs to reach more health systems and patients, and we are excited to bring on a Senior Data Engineer.

In this role, you will be responsible for driving the architecture, design, and development of our data warehouse and analytics solutions, alongside APIs that allow other internal teams to interact with our data. The ideal candidate will be able to collaborate effectively with Memora's Product Management, Engineering, QA, TechOps, and business stakeholders.

This role will work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions. Ideal candidates will be driven not only by the problem we are solving but also by the innovative approach and technology that we are applying to healthcare, looking to make a significant impact on healthcare delivery. We're looking for someone with exceptional curiosity and enthusiasm for solving hard problems.

Primary Responsibilities:

* Collaborate with the Technical Lead, fellow engineers, Product Managers, QA, and TechOps to develop, test, secure, iterate, and scale complex data infrastructure, data models, data pipelines, APIs, and application backend functionality
* Work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions
* Promote product development best practices, supportability, and code quality, both through leading by example and through mentoring other software engineers
* Manage and pare back technical debt, escalating to the Technical Lead and Engineering Manager as needed
* Establish best practices for designing, building, and maintaining data models
* Design and develop data models and transformation layers to support reporting, analytics, and AI/ML capabilities
* Develop and maintain solutions to enable self-serve reporting and analytics
* Build robust, performant ETL/ELT data pipelines
* Develop data quality monitoring solutions to increase data quality standards and metrics accuracy

Qualifications (Required):

* 3+ years of experience shipping, maintaining, and supporting enterprise-grade software products
* 3+ years of data warehousing / analytics engineering experience
* 3+ years of data modeling experience
* Disciplined in writing readable, testable, and supportable code in JavaScript, TypeScript, Node.js (Express), Python (Flask, Django, or FastAPI), or Java
* Expertise writing and consuming RESTful APIs
* Experience with relational or NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis, etc.)
* Experience with data warehouses (BigQuery, Snowflake, etc.)
* Experience with analytical and reporting tools, such as Looker or Tableau
* Inclination toward test-driven development and test automation
* Experience with Scrum methodology
* Excels at mentoring junior engineers
* B.S. in Computer Science or another quantitative field, or related work experience
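As an editorial illustration (not part of the original posting) of the "APIs that allow other internal teams to interact with our data" responsibility above, here is a minimal FastAPI sketch of a read-only reporting endpoint. The route, table, columns, and database URL are hypothetical placeholders.

```python
# Hypothetical sketch: a small internal data service exposing warehouse data.
# The connection URL, program_engagement_events table, and columns are placeholders.
from fastapi import FastAPI, HTTPException
import sqlalchemy as sa

app = FastAPI(title="internal-reporting-api-sketch")

# In a real service the URL would come from configuration / secrets management.
engine = sa.create_engine("postgresql+psycopg2://user:pass@warehouse-host/analytics")

@app.get("/programs/{program_id}/engagement")
def program_engagement(program_id: int) -> dict:
    """Return daily engagement counts for one care program (illustrative only)."""
    query = sa.text(
        "SELECT event_date, COUNT(*) AS events "
        "FROM program_engagement_events "
        "WHERE program_id = :program_id "
        "GROUP BY event_date ORDER BY event_date"
    )
    with engine.connect() as conn:
        rows = conn.execute(query, {"program_id": program_id}).fetchall()
    if not rows:
        raise HTTPException(status_code=404, detail="program not found or no events")
    return {
        "program_id": program_id,
        "daily_events": [{"date": str(day), "events": count} for day, count in rows],
    }
```

A production version would add authentication, pagination, and a warehouse-specific connector (for example BigQuery or Snowflake rather than Postgres); the sketch only shows the shape of such an endpoint.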
Qualifications (Bonus):

* Understanding of DevOps practices and technologies (Docker, Kubernetes, CI/CD, test coverage and automation, branch and release management)
* Experience with security tooling in the SDLC and Security by Design principles
* Experience with observability and APM tooling (Sumo Logic, Splunk, Sentry, New Relic, Datadog, etc.)
* Experience with an integration framework (Mirth Connect, Mule ESB, Apache NiFi, Boomi, etc.)
* Experience with healthcare data interoperability frameworks (FHIR, HL7, CCDA, etc.)
* Experience with healthcare data sources (EHRs, claims, etc.)
* Experience working at a startup

What You Get:

* An opportunity to work on a rapidly scaling care delivery platform, engaging thousands of patients and care team members and growing 2-3x annually
* A highly collaborative environment and the fun challenges of scaling a high-growth startup
* Work alongside world-class clinical, operational, and technical teams to build and scale Memora
* Shape how leading health systems and plans think about modernizing the care delivery experience for their patients and care teams
* Improve the way care is delivered for hundreds of thousands of patients
* Gain deep expertise in healthcare transformation and direct customer exposure to the country's most innovative health systems and plans
* Ownership over your success and the ability to significantly impact the growth of our company
* Competitive salary and equity compensation with benefits including health, dental, and vision coverage, flexible work hours, paid maternity/paternity leave, bi-annual retreats, a MacBook, and a 401(k) plan

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Python, DevOps, NoSQL, Senior, Engineer, and Backend jobs:
$60,000 – $110,000/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
ABOUT THRIVE MARKET

Thrive Market was founded in 2014 with a mission to make healthy living easy and affordable for everyone. As an online, membership-based market, we deliver the highest quality healthy and sustainable products at member-only prices, while matching every paid membership with a free one for someone in need. Every day, we leverage innovative technology and member-first thinking to help our more than 1,000,000 members find better products, support better brands, and build a better world in the process. We recently reached a significant milestone by becoming a Certified B Corporation, making us the largest grocer to earn this coveted qualification.

THE ROLE

Thrive Market's Data Engineering team is seeking a Senior Data Engineer!

We are looking for a brilliant, dedicated, and hardworking engineer to help us build high-impact products alongside our Data Strategy Team. Our site sees millions of unique visitors every month, and our customer growth currently makes us one of the fastest-growing e-commerce companies in Los Angeles. We are looking for a Senior Data Engineer with hands-on experience working on structured, semi-structured, and complex data processing and streaming frameworks. We need your amazing software engineering skills to help us execute our Data Engineering & Analytics initiatives and turn them into products that will provide great value to our members. In this role, we are hoping to bring in someone who is equally excited about our mission, eager to learn the tech behind the company, and able to work cross-functionally with other engineering teams.

RESPONSIBILITIES
* Work across multiple projects and efforts to orchestrate and deliver cohesive data engineering solutions in partnership with various functional teams at Thrive Market
* Be hands-on and take ownership of the complete cycle of data services, from data ingestion, data processing, and ETL to data delivery for reporting
* Collaborate with other technical teams to deliver data solutions that meet business and technical requirements; define technical requirements and implementation details for the underlying data lake, data warehouse, and data marts
* Identify, troubleshoot, and resolve production data integrity and performance issues
* Collaborate with all areas of data management as a lead to ensure patterns, decisions, and tooling are implemented in accordance with enterprise standards
* Perform data source gap analysis and create data source/target catalogs and mappings
* Develop a thorough knowledge and understanding of cross-system integration, interactions, and relationships in order to develop an enterprise view of Thrive Market's future data needs
* Design, coordinate, and execute pilots/prototypes/POCs to validate specific scenarios and provide an implementation roadmap
* Recommend and ensure technical functionality (e.g., scalability, security, performance, data recovery, reliability) for Data Engineering
* Facilitate workshops to define requirements and develop data solution designs
* Apply enterprise and solution architecture decisions to data architecture frameworks and data models
* Maintain a repository of all data architecture artifacts and procedures
* Collaborate with IT teams, software providers, and business owners to predict and devise data architecture that addresses business needs for collection, aggregation, and interaction with multiple data streams
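As a rough editorial illustration of the streaming-ingestion work referenced in this posting (the qualifications below mention writing Kafka producers and consumers, or AWS Kinesis), here is a minimal Python sketch using the confluent-kafka client. The broker address, topic, group id, and message fields are hypothetical.

```python
# Hypothetical sketch: producing and consuming order events with Kafka.
# Broker, topic, group id, and payload fields are placeholders.
import json
from confluent_kafka import Consumer, Producer

BROKERS = "localhost:9092"   # placeholder broker list
TOPIC = "order-events"       # placeholder topic

# Producer side: publish an order event as JSON.
producer = Producer({"bootstrap.servers": BROKERS})
event = {"order_id": 1001, "member_id": 42, "total": 59.90}
producer.produce(TOPIC, key=str(event["order_id"]), value=json.dumps(event))
producer.flush()

# Consumer side: read events and hand them to downstream processing.
consumer = Consumer({
    "bootstrap.servers": BROKERS,
    "group.id": "ingestion-sketch",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        record = json.loads(msg.value())
        print("ingested:", record)  # a real pipeline would write to the data lake here
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```

The same shape maps onto AWS Kinesis (for example boto3's put_record and get_records) if Kinesis is used instead of Kafka.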
QUALIFICATIONS
* Hands-on experience programming in Python, Scala, or Java
* Expertise with RDBMS and data warehousing (strong SQL) using Redshift, Snowflake, or similar
* In-depth knowledge and experience with data and information architecture patterns and implementation approaches for Operational Data Stores, Data Warehouses, Data Marts, and Data Lakes
* Proficiency in logical/physical data architecture, design, and development
* Experience implementing data lake / big data analytics platforms, either cloud-based or on-premises; AWS preferred
* Experience working with high volumes of data; experience in the design, implementation, and support of highly distributed data applications
* Experience with development tools for CI/CD, unit and integration testing, automation, and orchestration, e.g., GitHub, Jenkins, Concourse, Airflow, Terraform
* Experience writing Kafka producers and consumers, or experience with AWS Kinesis
* Hands-on experience developing a distributed data processing platform with big data technologies like Hadoop, Spark, etc.
* A knack for independence (hands-on) as well as teamwork
* Excellent analytical and problem-solving skills, often in light of ill-defined issues or conflicting information
* Experience with streaming data ingestion, machine learning, and Apache Spark a plus
* Adept at eliciting, gathering, and managing requirements in an Agile delivery environment
* Excellent communication and presentation skills (verbal and written) across all levels of the organization; ability to translate ambiguous concepts into tangible ideas

BELONG TO A BETTER COMPANY
* Comprehensive health benefits (medical, dental, vision, life, and disability)
* Competitive salary (DOE) + equity
* 401k plan
* Flexible Paid Time Off
* Subsidized ClassPass Membership with access to fitness classes and wellness and beauty experiences
* Ability to work in our beautiful co-working space at WeWork in Playa Vista and other locations
* Dog-Friendly Office
* Free Thrive Market membership and discount on private label products
* Coverage for Life Coaching & Therapy Sessions on our holistic mental health and well-being platform

We're a community of more than 1 million members who are united by a singular belief: It should be easy to find better products, support better brands, make better choices, and build a better world in the process.

Feeling intimidated or hesitant about applying because you don't meet every requirement? Studies show that women and people of color are less likely to apply for jobs if they do not meet every single qualification. At Thrive Market, we believe in building a diverse, inclusive, and authentic culture. If you are excited about this role along with our mission and values, we sincerely encourage you to apply anyway! As the great Los Angeles King Wayne Gretzky said, "You miss 100% of the shots you don't take." Take the shot!
Thrive Market is an EOE/Veterans/Disabled/LGBTQ employer.

At Thrive Market, our goal is to be a diverse and inclusive workplace that is representative, at all job levels, of the members we serve and the communities we operate in. We're proud to be an inclusive company and an Equal Opportunity Employer, and we prohibit discrimination and harassment of any kind. We believe that diversity and inclusion among our teammates is critical to our success as a company, and we seek to recruit, develop, and retain the most talented people from a diverse candidate pool. If you're thinking about joining our team, we expect that you would agree!

If you need assistance or accommodation due to a disability, please email us at [email protected] and we'll be happy to assist you.

Ensure your Thrive Market job offer is legitimate and don't fall victim to fraud. Thrive Market never seeks payment from job applicants. Thrive Market recruiters will only reach out to applicants from an @thrivemarket.com email address. For added security, where possible, apply through our company website at www.thrivemarket.com.

© Thrive Market 2023. All rights reserved.

JOB INFORMATION
* Compensation Description - The base salary range for this position is $160,000 - $190,000 per year.
* Compensation may vary outside of this range depending on several factors, including a candidate's qualifications, skills, competencies and experience, and geographic location.
* Total Compensation includes Base Salary, Stock Options, Health & Wellness Benefits, Flexible PTO, and more!

#LI-DR1

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Cloud, Scala, Senior, and Engineer jobs:
$55,000 – $105,000/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
#Location
Los Angeles or Remote
Who are Tide:

At Tide, we're on a mission to save businesses time and money. We're the leading provider of UK SME business accounts and one of the fastest-growing FinTechs in the UK. Using the latest tech, we design solutions with SMEs in mind, and our member-driven financial platform is transforming the business banking market. Not only do we offer our members business accounts and related banking services, but also a comprehensive set of highly connected admin tools for businesses.

Tide is about doing what you love. We're looking for someone to join us on our exciting scale-up journey and be a part of something special. We want passionate Tideans to drive innovation and help build a best-in-class platform to support our members. You will be comfortable in ambiguous situations and will be able to navigate the evolving FinTech environment. Imagine shaping how millions of Tide members discover and engage with business banking platforms and building this on a global scale.

What we're looking for:

As part of the team, you will be responsible for building and running the data pipelines and services that are required to support business functions, reports, and dashboards. We are heavily dependent on Snowflake, Airflow, Fivetran, dbt, and Looker for our business intelligence, and we embrace AWS as a key partner across our engineering teams.

As an Analytics Engineer you'll be:

* Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function
* Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
* Mentoring other junior engineers in the team
* Being a "go-to" expert for data technologies and solutions
* Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges
* Troubleshooting and resolving technical issues as they arise
* Looking for ways of improving both what and how data pipelines are delivered by the department
* Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports, and owning the delivery of data models and reports end to end
* Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future
* Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture and other "delta loading" approaches
* Discovering, transforming, testing, deploying, and documenting data sources
* Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
* Building Looker dashboards for use cases if required
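As an editorial sketch (not part of the posting) of how the dbt and Airflow responsibilities above commonly fit together, here is a minimal Airflow DAG that builds dbt models and then runs dbt tests on a daily schedule. The dag_id, project path, and schedule are hypothetical, and it assumes Airflow 2.x with the dbt CLI installed on the workers and configured against the warehouse (e.g. Snowflake).

```python
# Hypothetical sketch: an Airflow 2.x DAG that runs dbt models and tests daily.
# The dag_id, project path, and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_PROJECT_DIR = "/opt/analytics/dbt_project"  # placeholder path

with DAG(
    dag_id="dbt_daily_build_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_PROJECT_DIR} && dbt run",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_PROJECT_DIR} && dbt test",
    )

    # Models must build successfully before the data-quality tests run.
    dbt_run >> dbt_test
```

dbt's built-in schema tests (for example not_null and unique checks declared in model YAML) are one common way to catch the data quality issues mentioned above before they reach dashboards.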
This can include Change Capture, Change Data Control and other โdelta loadingโ approaches\n\n* Discovering, transforming, testing, deploying and documenting data sources\n\n* Applying, help defining, and championing data warehouse governance: data quality, testing, coding best practises, and peer review\n\n* Building Looker Dashboard for use cases if required\n\n\n\n\nWhat makes you a great fit: \n\n\n* You have 7+ years of extensive development experience using snowflake or similar data warehouse technology\n\n* You have working experience with dbt and other technologies of the modern datastack, such as Snowflake, Apache Airflow, Fivetran, AWS, git ,Looker\n\n* You have experience in agile processes, such as SCRUM\n\n* You have extensive experience in writing advanced SQL statements and performance tuning them\n\n* You have experience in Data Ingestion techniques using custom or SAAS tool like fivetran\n\n* You have experience in data modelling and can optimise existing/new data models\n\n* You have experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets\n\n* You having experience architecting analytical databases (in Data Mesh architecture) is added advantage\n\n* You have experience working in agile cross-functional delivery team\n\n* You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment\n\n\n\n\nWhat youโll get in return: \n\nMake work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, youโll get:\n\n\nCompetitive salary\n\nSelf & Family Health Insurance\n\nTerm & Life Insurance\n\nOPD Benefits\n\nMental wellbeing through Plumm\n\nLearning & Development Budget\n\nWFH Setup allowance\n\n15 days of Privilege leaves\n\n12 days of Casual leaves\n\n12 days of Sick leaves\n\n3 paid days off for volunteering or L&D activities\n\n\n\n\nTidean Ways of Working \n\nAt Tide, weโre Member First and Data Driven, but above all, weโre One Team. Our Working Out of Office (WOO) policy allows you to work from anywhere in the world for up to 90 days a year. We are remote first, but when you do want to meet new people, collaborate with your team or simply hang out with your colleagues, our offices are always available and equipped to the highest standard. We offer flexible working hours and trust our employees to do their work well, at times that suit them and their team.\n\nTide is a place for everyone\n\nAt Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity status or disability status. We believe itโs what makes us awesome at solving problems! We are One Team and foster a transparent and inclusive environment, where everyoneโs voice is heard.\n\n#LI-NN1 #LI-Remote\n\n \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar Design, SaaS, Git, and Engineer jobs:
$55,000 – $105,000/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
#Location
Bangalore, Karnataka, India