\nMaintainX is the world's leading mobile-first workflow management platform for industrial and frontline workers. We are a modern, IoT-enabled, cloud-based tool for maintenance, safety, and operations on equipment and facilities. MaintainX powers operational excellence for 7,500+ businesses including Duracell, Univar Solutions Inc., Titan America, McDonald's, Brenntag, Cintas, Michaels, and Shell.\n\nWe've raised $104 million in venture capital following a recent Series C funding round led by Bain Capital Ventures, Bessemer Ventures, August Capital, Amity Ventures, and Ridge Ventures, as well as CEOs from GE, Twilio, Toast, and PagerDuty.\n\nThis role is remote in Canada or the USA.\n\nAs a Data Analyst, you will use your data analysis expertise to uncover valuable insights that inform product strategy at MaintainX. You will use data sets across different platforms to understand product health and feature performance, and partner with Product Managers to help them understand opportunities to improve their workstreams.\n\nWhat you'll do:\n\n\n* Guide product strategy through analyses of the performance of product features and experiments.\n\n* Use internal and external data sources to identify opportunities for product development improvements.\n\n* Work with Product team partners to inform goal setting and help implement monitoring, such as reporting dashboards.\n\n* Present results to Product team partners and collaborators.\n\n\n\n\nAbout you:\n\n\n* Bachelor's degree in Statistics, Mathematics, Economics, Engineering, Computer Science, or a similar field.\n\n* 4+ years of professional experience in the SaaS industry analyzing large datasets, visualizing information, and informing strategy.\n\n* Experience in statistical modelling with at least one statistical software package (e.g. 
Python, R)\n\n* Strong analytical skill set, capable of defining problems and developing innovative solutions using appropriate methods and frameworks.\n\n* Expertise in creating reports and presentations for non-technical audiences.\n\n* Experience with product analytics platforms (e.g. Google Analytics, Amplitude, etc.)\n\n* Proficiency with SQL, Python, or a similar language, with the ability to learn additional analytics tools.\n\n* Proficiency with data visualization (Tableau).\n\n\n\n\nWhat's in it for you:\n\n\n* Competitive salary and meaningful equity opportunities.\n\n* Healthcare, dental, and vision coverage.\n\n* 401(k) / RRSP enrollment program.\n\n* Take-what-you-need PTO.\n\n* A work culture where:\n\n\n\n* You'll work alongside folks across the globe who reflect the MaintainX values: Smart, Humble, Optimist.\n\n* We believe in meritocracy, where ideas and effort are publicly celebrated.\n\n\n\n\n\n\nAbout us:\n\nOur mission is to make the lives of blue-collar workers easier worldwide by creating software that meets their needs and realities. Our product is truly life-changing for the 80% of the workforce that doesn't work behind a desk and needs enterprise-grade software at their fingertips.\n\nMaintainX is committed to creating a diverse environment. All qualified applicants will receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar SaaS and Cloud jobs:\n\n
$52,500 — $115,000/year\n
\n\n#Benefits\n
401(k)\n\nDistributed team\n\nAsync\n\nVision insurance\n\nDental insurance\n\nMedical insurance\n\nUnlimited vacation\n\nPaid time off\n\n4 day workweek\n\n401k matching\n\nCompany retreats\n\nCoworking budget\n\nLearning budget\n\nFree gym membership\n\nMental wellness budget\n\nHome office budget\n\nPay in crypto\n\nPseudonymous\n\nProfit sharing\n\nEquity compensation\n\nNo whiteboard interview\n\nNo monitoring system\n\nNo politics at work\n\nWe hire old (and young)\n\n
\n\n#Location\nSan Francisco, California, United States
๐ Please reference you found the job on Remote OK, this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
\nCAPE Analytics is the leading provider of geospatial property intelligence. CAPE provides instant property insights for millions of residential and commercial buildings by analyzing high-resolution imagery, property records, and novel data sources using computer vision and machine learning. With a mission to better understand and protect the built environment, CAPE provides property stakeholders with risk-predictive property attributes that are more timely, accurate, and objective than on-site inspections. Comprised of insurance, real estate, and data experts, CAPE is backed by leading venture capital firms and insurance carriers.\n\n\nA BIT ABOUT US\nSince our founding in 2014, CAPE Analytics has used machine learning and computer vision to pioneer a new form of property information, built specifically for the organizations that finance, protect, and invest in our homes and businesses. Our 50+ (and rapidly growing!) clients across insurance and real estate are leading a digital transformation to secure properties and livelihoods in the face of complex trends in housing and climate. \n\n\n\n\nTHE OPPORTUNITY\nCAPE's insurance solutions have been adopted by leading carriers across the U.S., Canada, and Australia... but we are just getting started. Over the past 8 years, we've constructed an analytics platform purpose-built for deep learning. On the heels of our recent $44 million Series C financing, we're growing rapidly. In CAPE's next phase, we're setting out to solve a larger share of the problem, leveraging a radically expanded array of input data sources and advanced machine learning technologies. \n\n\nThe Senior Manager, Client Development is responsible for managing existing client relationships, ensuring successful adoption and contract renewals, and identifying new business opportunities for the CAPE suite of products within the existing client base. 
This individual will conduct business from their virtual office and travel to customer/prospect locations when necessary.\n\n\nThis position carries highly competitive salary, commission, and stock option structures that reward your efforts, with career growth opportunities readily available based on personal achievement. Continue reading to learn more about the pay range for this position and CAPE's compensation philosophy.\n\n\n\n\nThis position is an Individual Contributor role.\nPosition OTE: $200k - $230k. Base salary range listed below.\n\n\n\nWITHIN 1 MONTH, YOU'LL:\n* Work with the sales team and leadership to learn/understand the CAPE products and fully articulate the CAPE value proposition. \n* Understand the tools, technology, and processes deployed across our organization. \n* Frame out account strategies for existing clients with the sales leadership team. \n* Learn about insurance deal cycles and the influencing factors that can contribute to CAPE's success.\n* Meet and understand CAPE's marketing and sales enablement efforts.\n\n\n\nWITHIN 3 MONTHS, YOU'LL:\n* Drive client adoption, discovery, and advancement conversations. \n* Make thoughtful recommendations to the sales leadership team on account injection points. Independently manage external meetings to advance carriers' usage of and value from the CAPE solution.\n* Work closely with sales/marketing teams to build and execute account-based marketing plans. \n* Further your understanding of CAPE and the CAPE market impact potential. \n* Understand new CAPE innovations and develop tailored GTM strategies.\n\n\n\nWITHIN 6 MONTHS, YOU'LL:\n* Have established deep contacts internally and externally. \n* Have become proficient in the tools used to advance deal cycles. Understand the impact of business intelligence and how success can generate new/faster success. \n* Fully understand the CAPE processes and value propositions. 
\n* Have quantifiable and sustainable success driving carrier relationships. \n\n\n\nTHE SKILL SET\n* 5+ years of sales, sales engineering, client success, account management, client training, or equivalent experience, preferably with a SaaS or data-driven product company\n* Proven success supporting technology adoption, usage, and client engagement\n* Stability: a proven history of being a consistent top performer\n* Expert verbal and written communication skills, including the ability to present to executive-level audiences as well as users (both business and technical audiences)\n* Domain expertise in SaaS, data-driven products, or experience in the insurance vertical required \n* Proficiency in understanding insurance carrier processes \n* Strong business acumen and ability to build strong client relationships; experience interacting with senior executives across multiple industries \n* Ability to manage multiple concurrent sales cycles effectively \n* Familiarity and comfort with the Google suite of products, including Slides and Sheets \n* Experience with CRM tools, such as Salesforce \n* Ability to troubleshoot and resolve service-related issues, whether business or technical\n* Ability to work cooperatively within a team and across the organization matrix to achieve group and organizational goals\n* Strong organizational and problem-solving skills\n* Tableau and/or demonstrated mastery of Excel preferred\n* May require local or overnight business travel up to 25%\n\n\n\n\n$120,000 - $138,000 a year\n\nCape Analytics believes in creating a more equitable environment for everyone, and is committed to standing against wage gap disparities that are widened by limited pay transparency. \nPositions at Cape may also include stock options, bonus opportunities, and/or variable incentive pay (commissions) to supplement your base earnings. 
Additionally, Cape offers top-notch insurance options and competitive benefits, such as unlimited PTO, company outings, remote work capabilities, and more! \n\nTHE TEAM\nAs a member of the CAPE sales team, you will work alongside market leaders with a proven track record of influencing and elevating the insurance market. The Senior Manager, Client Development role works across the organization, teaming with marketing, client success, and product to create and implement new and innovative go-to-market strategies. As a member of the CAPE team as a whole, you'll work alongside brilliant, collaborative, eager, and enthusiastic team members who want to make an impact across various markets.\n\n\n\n\nWe believe:\n\n\n*Talent is critical, but best when tempered with humility\n*Self-motivation leads to the best outcomes\n*Open, direct communication is a sign of respect\n*Teamwork drives success\n*Having fun together is an important part of the job\n\n\nView our CCPA policy here\n\n\n***CAPE Analytics is an E-Verify participant.*** \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar SaaS, Senior, Marketing, and Sales jobs:\n\n
$55,000 — $90,000/year\n
\n\n#Location\nRemote
\nPlanetScale is the modern MySQL platform. Our products are loved by developers, builders, scalers, creators, and enterprise buyers alike.\n\nCreated by the people who built the infrastructure at YouTube, Instagram, GitHub, and Slack, PlanetScale is a Series C start-up with over $100 million in funding raised, backed by leading investors such as a16z, SignalFire, Insight Partners, and Kleiner Perkins. We are just getting started!\n\nWhy PlanetScale Marketing?\n\nThe Marketing team at PlanetScale drives awareness and adoption of our database platform through storytelling, brand activations, demand generation, community building, and education. We are responsible for transforming how the industry thinks about using and developing alongside databases. We are changing the game, and that's no easy feat.\n\nJob Summary\n\nWe are hiring a Developer Community Manager who will be responsible for managing our social channels, creating social content, nurturing and growing our developer community, and engaging with developers across various external channels. Our brand is highly technical, educational, credible, and accessible. You should feel comfortable taking existing technical content and repurposing it into easy-to-digest standalone social media posts.\n\nWhat's the job to be done?\n\n\n* Manage our organic social media channels by creating/maintaining a content calendar, repurposing existing content (blogs, videos, webinars, podcasts, etc.), writing copy, and scheduling. 
You will be in charge of all content that goes out on our Twitter, LinkedIn, Facebook, and Instagram.\n\n* Monitor our social media channels (Facebook, Twitter, LinkedIn, Instagram), the PlanetScale Discord, and other channels like community Slack groups, LinkedIn groups, subreddits, etc., to build a community of followers, route support questions, and increase reach.\n\n* Repurpose existing content to create engaging and educational material fit for each outlet\n\n* Build and maintain intelligence reports on the competitive landscape\n\n* Engage with the community to identify and build relationships with strong community members/leaders\n\n* Ensure that social content meets brand guidelines, our overall communication style, and the company vision\n\n* Develop strategies to grow the company's community/customer base\n\n* Work in conjunction with our Paid Media team, Developer Education team, and Campaigns team to increase our campaign exposure and visibility in the market\n\n* Monitor website traffic and customer engagement through metrics, and be able to interpret those metrics\n\n\n\n\nThese attributes best describe you:\n\n\n* You are involved in and love interacting with technical communities\n\n* You're a lifelong learner with an interest in technical topics\n\n* You are data-driven and use analytics to prioritize your efforts and develop strategy\n\n\n\n\nWhat you will need:\n\n\n* 2+ years of managing communities and social channels for a developer audience\n\n* Strong technical knowledge, with a background in engineering or prior experience as a developer advocate preferred\n\n* Knowledge of online channels for developer marketing and marketing best practices for each\n\n* Exemplary communication skills without a fear of over-communication. 
This role will require effective collaboration and coordination across internal and external stakeholders\n\n* Proven experience carrying out marketing efforts, including planning, prioritizing, and implementing strategy\n\n* Highly organized and proficient at managing multiple projects at the same time\n\n* Proficient in spoken and written English.\n\n\n\n\nWhat else will help you be successful:\n\n\n* Experience working in a remote organization\n\n* Previous experience at a database or SaaS company\n\n\n\n\nAt PlanetScale, we believe in supporting people to do their best work and thrive no matter the location. Our mission is to build a diverse, equitable, and inclusive company. We strive to build an inclusive environment where all people feel that they are equally respected and valued, whether they are a candidate or an employee. We welcome applicants of any educational background, gender identity and expression, sexual orientation, religion, ethnicity, age, citizenship, socioeconomic status, disability, pregnancy status, and veteran status.\n\nIf you need any accommodations, please inform our Talent Acquisition team upon initial contact. We are happy to accommodate!\n\nTotal Compensation and Pay Transparency\n\nAn employee's total compensation consists of base salary + variable comp where appropriate + benefits + equity. A member of our Talent Acquisition team will be happy to answer any further questions when we engage with you to begin the interview process. \n\n\nSalary Range: $110,000 - $140,000\n\n#LI-Recruiter \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated the salary based on similar SaaS, Education, and Marketing jobs:\n\n
$50,000 — $100,000/year\n
\n\n#Location\nRemote, Oregon, United States
This job post is closed and the position is probably filled. Please do not apply. Work for Splitgraph and want to re-open this job? Use the edit link in the email when you posted the job!
# We're building the Data Platform of the Future\nJoin us if you want to rethink the way organizations interact with data. We are a **developer-first company**, committed to building around open protocols and delivering the best experience possible for data consumers and publishers.\n\nSplitgraph is a **seed-stage, venture-funded startup hiring its initial team**. The two co-founders are looking to grow the team to five or six people. This is an opportunity to make a big impact on an agile team while working closely with the founders.\n\nSplitgraph is a **remote-first organization**. The founders are based in the UK, and the company is incorporated in both the USA and the UK. Candidates are welcome to apply from any geography. We want to work with the most talented, thoughtful and productive engineers in the world.\n# Open Positions\n**Data Engineers welcome!** The job titles have "Software Engineer" in them, but at Splitgraph there's a lot of overlap between data and software engineering. We welcome candidates from all engineering backgrounds.\n\n[Senior Software Engineer - Backend (mainly Python)](https://www.notion.so/splitgraph/Senior-Software-Engineer-Backend-2a2f9e278ba347069bf2566950857250)\n\n[Senior Software Engineer - Frontend (mainly TypeScript)](https://www.notion.so/splitgraph/Senior-Software-Engineer-Frontend-6342cd76b0df483a9fd2ab6818070456)\n\n→ [**Apply to Job**](https://4o99daw6ffu.typeform.com/to/ePkNQiDp) ← (same form for both positions)\n\n# What is Splitgraph?\n## **Open Source Toolkit**\n\n[Our open-source product, sgr,](https://www.github.com/splitgraph/splitgraph) is a tool for building, versioning and querying reproducible datasets. It's inspired by Docker and Git, so it feels familiar. And it's powered by PostgreSQL, so it works seamlessly with existing tools in the Postgres ecosystem. 
Use Splitgraph to package your data into self-contained data images that you can share with other Splitgraph instances.\n\n## **Splitgraph Cloud**\n\nSplitgraph Cloud is a platform for data cataloging, integration and governance. The user can upload data, connect live databases, or "push" versioned snapshots to it. We give them a unified SQL interface to query that data, a catalog to discover and share it, and tools to build/push/pull it.\n\n# Learn More About Us\n\n- Listen to our interview on the [Software Engineering Daily podcast](https://softwareengineeringdaily.com/2020/11/06/splitgraph-data-catalog-and-proxy-with-miles-richardson/)\n\n- Watch our co-founder Artjoms present [Splitgraph at the Bay Area ClickHouse meetup](https://www.youtube.com/watch?v=44CDs7hJTho)\n\n- Read our HN/Reddit posts ([one](https://news.ycombinator.com/item?id=24233948) [two](https://news.ycombinator.com/item?id=23769420) [three](https://news.ycombinator.com/item?id=23627066) [four](https://old.reddit.com/r/datasets/comments/icty0r/we_made_40k_open_government_datasets_queryable/))\n\n- [Read our blog](https://www.splitgraph.com/blog)\n\n- Read the slides from our early (2018) presentations: ["Docker for Data"](https://www.slideshare.net/splitgraph/splitgraph-docker-for-data-119112722), [AHL Meetup](https://www.slideshare.net/splitgraph/splitgraph-ahl-talk)\n\n- [Follow us on Twitter](https://www.twitter.com/splitgraph)\n\n- [Find us on GitHub](https://www.github.com/splitgraph)\n\n- [Chat with us in our community Discord](https://discord.gg/eFEFRKm)\n\n- Explore the [public data catalog](https://www.splitgraph.com/explore) where we index 40k+ datasets\n\n# How We Work: What's our stack look like?\n\nWe prioritize developer experience and productivity. We resent repetition and inefficiency, and we never hesitate to automate the things that cause us friction. 
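As a quick flavor of the data-image workflow described above, here is a minimal sketch of building and publishing a versioned dataset with the sgr CLI. The repository and table names are made up for illustration, and the exact flags should be checked against sgr's documentation:\n\n```shell\n# Create a repository and load some data into it (names are illustrative)\nsgr init myorg/demo\nsgr sql --schema myorg/demo "CREATE TABLE cities (name text, population int)"\nsgr sql --schema myorg/demo "INSERT INTO cities VALUES ('London', 8900000)"\n\n# Snapshot the current state as an immutable data image, Git/Docker-style\nsgr commit myorg/demo\n\n# Push the image so other Splitgraph instances can clone and query it\nsgr push myorg/demo\n```\n\n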
Here's a sampling of the languages and tools we work with:\n\n- **[Python](https://www.python.org/) for the backend.** Our [core open source](https://www.github.com/splitgraph/splitgraph) tech is written in Python (with [a bit of C](https://github.com/splitgraph/Multicorn) to make it more interesting), as well as most of our backend code. The Python code powers everything from authentication routines to database migrations. We use the latest version and tools like [pytest](https://docs.pytest.org/en/stable/), [mypy](https://github.com/python/mypy) and [Poetry](https://python-poetry.org/) to help us write quality software.\n\n- **[TypeScript](https://www.typescriptlang.org/) for the web stack.** We use TypeScript throughout our web stack. On the frontend we use [React](https://reactjs.org/) with [next.js](https://nextjs.org/). For data fetching we use [apollo-client](https://www.apollographql.com/docs/react/) with fully-typed GraphQL queries auto-generated by [graphql-codegen](https://graphql-code-generator.com/) based on the schema that [Postgraphile](https://www.graphile.org/postgraphile) creates by introspecting the database.\n\n- **[PostgreSQL](https://www.postgresql.org/) for the database, because of course.** Splitgraph is a company built around Postgres, so of course we are going to use it for our own database. In fact, we actually have three databases. 
We have `auth-db` for storing sensitive data, `registry-db` which acts as a [Splitgraph peer](https://www.splitgraph.com/docs/publishing-data/push-data) so users can push Splitgraph images to it using [sgr](https://www.github.com/splitgraph/splitgraph), and `cloud-db` where we store the schemata that Postgraphile uses to autogenerate the GraphQL server.\n\n- **[PL/pgSQL](https://www.postgresql.org/docs/current/plpgsql.html) and [PL/Python](https://www.postgresql.org/docs/current/plpython.html) for stored procedures.** We define a lot of core business logic directly in the database as stored procedures, which are ultimately [exposed by Postgraphile as GraphQL endpoints](https://www.graphile.org/postgraphile/functions/). We find this to be a surprisingly productive way of developing, as it eliminates the need for manually maintaining an API layer between data and code. It presents challenges for testing and maintainability, but we've built tools to help with database migrations and rollbacks, and an end-to-end testing framework that exercises the database routines.\n\n- **[PostgREST](https://postgrest.org/en/v7.0.0/) for auto-generating a REST API for every repository.** We use this excellent library (written in [Haskell](https://www.haskell.org/)) to expose an [OpenAPI](https://github.com/OAI/OpenAPI-Specification)-compatible REST API for every repository on Splitgraph ([example](http://splitgraph.com/mildbyte/complex_dataset/latest/-/api-schema)).\n\n- **Lua ([luajit](https://luajit.org/luajit.html) 5.x), C, and [embedded Python](https://docs.python.org/3/extending/embedding.html) for scripting [PgBouncer](https://www.pgbouncer.org/).** Our main product, the "data delivery network", is a single SQL endpoint where users can query any data on Splitgraph. Really it's a layer of PgBouncer instances orchestrating temporary Postgres databases and proxying queries to them, where we load and cache the data necessary to respond to a query. 
We've added scripting capabilities to enable things like query rewriting, column masking, authentication, ACL, orchestration, firewalling, etc.\n\n- **[Docker](https://www.docker.com/) for packaging services.** Our CI pipeline builds every commit into about a dozen different Docker images, one for each of our services. A production instance of Splitgraph can be running over 60 different containers (including replicas).\n\n- **[Makefile](https://www.gnu.org/software/make/manual/make.html) and [docker-compose](https://docs.docker.com/compose/) for development.** We use [a highly optimized Makefile](https://www.splitgraph.com/blog/makefile) and `docker-compose` so that developers can easily spin up a stack that mimics production in every way, while keeping it easy to hot reload, run tests, or add new services or configuration.\n\n- **[Nomad](https://www.nomadproject.io/) for deployment and [Terraform](https://www.terraform.io/) for provisioning.** We use Nomad to manage deployments and background tasks. Along with Terraform, we're able to spin up a Splitgraph cluster on AWS, GCP, Scaleway or Azure in just a few minutes.\n\n- **[Airflow](https://airflow.apache.org/) for job orchestration.** We use it to run and monitor jobs that maintain our catalog of [40,000 public datasets](https://www.splitgraph.com/blog/40k-sql-datasets), or ingest other public data into Splitgraph.\n\n- **[Grafana](https://grafana.com/), [Prometheus](https://prometheus.io/), [ElasticSearch](https://www.elastic.co/), and [Kibana](https://www.elastic.co/kibana) for monitoring and metrics.** We believe it's important to self-host fundamental infrastructure like our monitoring stack. We use this to keep tabs on important metrics and the health of all Splitgraph deployments.\n\n- **[Mattermost](https://mattermost.com/) for company chat.** We think it's absolutely bonkers to pay a company like Slack to hold your company communication hostage. 
That's why we self-host an instance of Mattermost for our internal chat. And of course, we can deploy it and update it with Terraform.\n\n- **[Matomo](https://matomo.org/) for web analytics.** We take privacy seriously, and we try to avoid including any third party scripts on our web pages (currently we include zero). We self-host our analytics because we don't want to share our user data with third parties.\n\n- **[Metabase](https://www.metabase.com/) and [Splitgraph](https://www.splitgraph.com) for BI and [dogfooding](https://en.wikipedia.org/wiki/Eating_your_own_dog_food)**. We use Metabase as a frontend to a Splitgraph instance that connects to Postgres (our internal databases), MySQL (Matomo's database), and ElasticSearch (where we store logs and DDN analytics). We use this as a chance to dogfood our software and produce fancy charts.\n\n- **The occasional best-of-breed SaaS services for organization.** As a privacy-conscious, independent-minded company, we try to avoid SaaS services as much as we can. But we still find ourselves unable to resist some of the better products out there. For organization we use tools like [Zoom](https://www.zoom.us) for video calls, [Miro](https://miro.com/) for brainstorming, [Notion](https://www.notion.so) for documentation (you're on it!), [Airtable for workflow management](https://airtable.com/), [PivotalTracker](https://www.pivotaltracker.com/) for ticketing, and [GitLab for dev-ops and CI](https://about.gitlab.com/).\n\n- **Other fun technologies** including [HAProxy](http://www.haproxy.org/), [OpenResty](https://openresty.org/en/), [Varnish](https://varnish-cache.org/), and bash. 
We don't touch them much because they do their job well and rarely break.\n\n# Life at Splitgraph\n**We are a young company building the initial team.** As an early contributor, you'll have a chance to shape our initial mission, growth and company values.\n\n**We think that remote work is the future**, and that's why we're building a remote-first organization. We chat on [Mattermost](https://mattermost.com/) and have video calls on Zoom. We brainstorm with [Miro](https://miro.com/) and organize with [Notion](https://www.notion.so).\n\n**We try not to take ourselves too seriously**, but we are goal-oriented with an ambitious mission.\n\n**We believe that as a small company, we can out-compete incumbents** by thinking from first principles about how organizations interact with data. We are very competitive.\n\n# Benefits\n- Fully remote\n\n- Flexible working hours\n\n- Generous compensation and equity package\n\n- Opportunity to make high-impact contributions to an agile team\n\n# How to Apply? Questions?\n[**Complete the job application**](https://4o99daw6ffu.typeform.com/to/ePkNQiDp)\n\nIf you have any questions or concerns, feel free to email us at [[email protected]](mailto:[email protected]) \n\nPlease mention the words **DESERT SPELL GOWN** when applying to show you read the job post completely (#RMjE2LjczLjIxNi4xMDA=). This is a feature to avoid spam applicants. Companies can search these words to find applicants that read this and see they're human.\n\n \n\n#Location\nWorldwide
# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply.
Closed by robot after the apply link errored with code 404, 3 years ago.
Crunch.io, part of YouGov PLC, is looking for a number of talented Python Developers to join its fully remote teams.

Crunch.io is a market-defining company in the analytics SaaS marketplace. We've built a revolutionary platform that transforms our customers' ability to drive insight from market research and survey data. We offer a complete survey data analysis platform that allows market researchers, analysts, and marketers to collaborate in a secure, cloud-based environment, using a simple, intuitive drag-and-drop interface to prepare, analyze, visualize and deliver survey data and analysis.

Quite simply, Crunch provides the quickest and easiest way for anyone, from CMO to PhD, with zero training, to analyze survey data. Users create tables, charts, graphs and maps. They filter and slice-and-dice survey data directly in their browser.

Tech Stack:

We currently run our in-house production Python code against Redis, MongoDB, and ElasticSearch services. We proxy API requests through NGINX, load balance with ELBs, and deploy our React web application to the AWS CloudFront CDN. Our current CI/CD process is built around GitHub, Jenkins, and Blue Ocean, including unit, integration, and end-to-end tests and automated system deployments. We deploy to Auto Scaling Groups using Ansible and cloud-init.

What will I be doing?

* Develop performance enhancements and new features in Crunch's proprietary Python in-memory database.
* Work closely with product managers, sales, and customer success teams to understand the system's functional and non-functional requirements.
* Establish realistic estimates for timelines and ensure that projects remain on target to meet deadlines.
* Contribute to code quality through unit testing, integration testing, code review, and system design using Python.
* Assist in diagnosing and fixing system failures quickly when they occur in your area of expertise. This is limited to when the on-call rotation needs a subject matter expert to help troubleshoot an issue.
* Design and implement RESTful API endpoints using the Python programming language.

Experience / Qualifications:

* Strong understanding of the software development lifecycle.
* A record of successful delivery of SaaS and cloud-based applications.
* Extensive programming experience using Python.
* A commitment to producing robust, testable code.
* Experience with data locality problems and caching issues.
* Expertise writing Cython or C extensions.
* Deep understanding of how a database system works internally (indexing, extents, memory management, concurrency, durability, journaling).
* Results-driven, self-motivated and enthusiastic.
* Experience working in a Linux environment.
* Experience with client/server architectures.
* A keen interest in learning new things.

Desirable Experience:

* Expertise with the NumPy library.
* Experience implementing custom messaging protocols (sequence numbers, TTL, etc.).
* Database experience using MongoDB and ElasticSearch.
* Bachelor's Degree in Programming, Computer Science, or an Engineering-related field.
* Pytest testing experience.
* Design and deployment of Continuous Integration tools (e.g., Jenkins, Bamboo, Travis, etc.).

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Python, Developer, Digital Nomad, React, Elasticsearch, C, API, SaaS and Linux jobs:

$70,000 — $120,000/year

#Benefits
- 401(k)
- Distributed team
- Async
- Vision insurance
- Dental insurance
- Medical insurance
- Unlimited vacation
- Paid time off
- 4 day workweek
- 401k matching
- Company retreats
- Coworking budget
- Learning budget
- Free gym membership
- Mental wellness budget
- Home office budget
- Pay in crypto
- Pseudonymous
- Profit sharing
- Equity compensation
- No whiteboard interview
- No monitoring system
- No politics at work
- We hire old (and young)
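The responsibilities above mention designing RESTful API endpoints in Python. As a rough, framework-neutral sketch of that kind of work, here is a minimal WSGI endpoint using only the standard library; the `/surveys` resource and its data are hypothetical, invented for illustration, since the post does not name a web framework or describe the actual API:

```python
import json
from wsgiref.util import setup_testing_defaults

# Hypothetical in-memory "survey" resource -- illustrative only.
SURVEYS = [{"id": 1, "name": "Brand tracker"}, {"id": 2, "name": "Exit poll"}]

def app(environ, start_response):
    """A tiny WSGI app exposing one REST-style endpoint: GET /surveys."""
    if environ["REQUEST_METHOD"] == "GET" and environ["PATH_INFO"] == "/surveys":
        body = json.dumps(SURVEYS).encode("utf-8")
        start_response("200 OK", [
            ("Content-Type", "application/json"),
            ("Content-Length", str(len(body))),
        ])
        return [body]
    # Anything else is not a known resource.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]

# Exercise the endpoint in-process, without opening a socket.
environ = {}
setup_testing_defaults(environ)  # fills in REQUEST_METHOD="GET", etc.
environ["PATH_INFO"] = "/surveys"

captured = {}
def start_response(status, headers):
    captured["status"] = status

payload = json.loads(b"".join(app(environ, start_response)))
```

To serve this for real you could hand `app` to `wsgiref.simple_server.make_server`; in production it would sit behind NGINX, as the tech stack section describes.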