\nBackground: Global Fishing Watch is an international, non-profit organization committed to advancing ocean governance through increased transparency. We create and publicly share knowledge about human activity at sea to enable fair and sustainable use of our ocean. Founded in 2015 through a collaboration between Oceana, SkyTruth, and Google, GFW became an independent non-profit organization in 2017. Using cutting-edge technology, we build and publicly share map visualizations, data, and analysis tools to enable scientific research and drive a transformation in how we manage our ocean. By 2030, we aim to monitor and map all commercial activity at sea, including all industrial fishing vessels, small-scale fishing activity, all large non-fishing vessels, and all fixed infrastructure such as aquaculture and oil rigs. We also plan to work with intergovernmental organizations and 30 governments around the globe to promote the adoption of transparency more widely and publicly share ocean data to drive better management of marine resources.\n\nThe Position\n\nThe Research and Innovation team at Global Fishing Watch (GFW) connects data science and machine learning experts with the scientific community to produce new datasets, publish impactful research, and empower others to use our data. This team harnesses satellite technology, machine learning, and big data to shed light on some of the most pressing issues facing the ocean.\n\nWe are now working to map the global footprint of commercial activity at sea, including the activity of all ocean-going vessels and fixed infrastructure. 
This work involves combining deep learning and data fusion techniques with petabytes of satellite imagery (radar and optical), and billions of GPS positions from vessels, mostly from the Automatic Identification System (AIS) and Vessel Monitoring Systems.\n\nThe Machine Learning Engineer will assist with large data pipelines of satellite imagery and help build computer vision models to detect and classify maritime objects in imagery data. The initial focus will be on vessel detection in high-resolution (3 m) PlanetScope optical imagery from Planet Labs, leveraging an existing model architecture developed for Sentinel-2. Subsequent work includes implementing new models to expand the detection capability to offshore infrastructure using new satellite imagery sources. The candidate will also collaborate closely with other members of the Research and Innovation team to correlate detected vessels (position, time and length) to vessels tracked by AIS. Finally, the candidate will work closely with the GFW Engineering and Product teams to ensure solutions are compatible and scalable within our cloud infrastructure.\n\nThe incumbent will gain experience working with leading researchers in the field and will interface daily with GFW's team of data scientists and machine learning experts. They will develop further technical skills in programming, big data, and cloud computing while working for a globally diverse and fully distributed organization. The successful candidate will be organized and excited to help Global Fishing Watch develop strong partnerships and cutting-edge research. 
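To make the detection-to-AIS correlation task concrete, here is a minimal sketch of one way to match a detected vessel to tracked AIS positions: a nearest-neighbor search inside a time window and a distance gate. All names, thresholds, and the matching rule itself are illustrative assumptions, not GFW's actual method.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class AisPosition:
    mmsi: int         # vessel identifier broadcast over AIS
    lat: float
    lon: float
    timestamp: float  # unix seconds

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def match_detection(det_lat, det_lon, det_time, ais_positions,
                    max_km=1.0, max_dt_s=600.0):
    """Return the MMSI of the nearest AIS position within the time window
    and distance gate, or None if the detection has no plausible match."""
    in_window = [p for p in ais_positions if abs(p.timestamp - det_time) <= max_dt_s]
    scored = [(haversine_km(det_lat, det_lon, p.lat, p.lon), p) for p in in_window]
    scored = [(d, p) for d, p in scored if d <= max_km]
    return min(scored, key=lambda dp: dp[0])[1].mmsi if scored else None
```

Detections that come back unmatched are the interesting by-product of this cross-matching: they are candidates for vessels operating without broadcasting AIS.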
\n\nPrincipal Duties and Responsibilities\n\nModel development for small object detection\n\n\n* Design, train, and evaluate computer vision models for object detection in satellite imagery, with an emphasis on vessel detection in optical imagery\n\n* Implement preprocessing pipelines to obtain imagery and prepare it for annotation and modelling\n\n* Devise annotation strategies and tools for labelling vessels and fixed infrastructure in satellite images\n\n* Improve our training datasets and build new training datasets for other human-made objects, potentially managing external annotation services\n\n\n\n\nAdditional tasks may include\n\n\n* Provide technical support to the senior machine learning engineer(s) responsible for developing and advancing other Global Fishing Watch models\n\n* Assist data fusion efforts to integrate detections from multiple sources (e.g. Sentinel-1 SAR and Sentinel-2 optical), accounting for the recall of each model, length of the objects, cloudiness, and image resolution, among other factors\n\n* Analyze large amounts of data from various sources, such as vessel tracking, identity, and satellite imagery, to identify trends, anomalies, and insights\n\n* Ensure the integrity and accuracy of key data pipelines and research BigQuery tables\n\n* Maintain and improve internal Python tools, such as modules and template repositories, to assist with migrating research projects from proofs of concept to automated prototypes\n\n* Lead or support eventual research publications and technical blog posts\n\n\n\nCandidate description\n\nSkills you should have\n\n\n* Bachelor's degree and at least four years of professional experience, or an equivalent combination of education and experience, in physical/earth sciences or a related field\n\n* Demonstrated skills and experience with Python\n\n* Strong foundation in mathematics and statistics\n\n* Familiarity working with geospatial data\n\n* Demonstrated experience working with cloud compute platforms and virtualized environments\n\n* Self-motivated with a strong curiosity and desire to learn new skills\n\n* Willingness to take ownership of projects and communicate project updates\n\n* Written and verbal communication skills in English\n\n* Ability to work with a remote team and embrace Slack, Google Suite, Jira, Notion and other collaborative tools\n\n\n\n\nAlso great\n\n\n* Some experience with database query languages such as SQL\n\n* Demonstrated experience with computer vision models\n\n* Demonstrated experience with frameworks such as TensorFlow or PyTorch\n\n* Familiarity with containerization tools like Docker and execution of models inside them\n\n* An appreciation for the complexities and rewards of collaborating in a remote, global and inclusive environment\n\n* Experience engaging with academic researchers and the peer-review process\n\n* Awareness of ethical considerations related to privacy and bias in satellite imagery analysis\n\n\n\n\nThe successful candidate will meet most, but not necessarily all, of the criteria above. Although it is obviously helpful, we do not expect that you already have a deep knowledge of building models or our key programming languages; we do expect that you have the aptitude to develop these skills and knowledge, and that you are excited about revealing human activity across the global ocean using these tools. If you don't think you check all the boxes, but believe you have unique skills that make you a great fit for the role, we want to hear from you!\n\nAdditional Information\n\nReporting to: Senior Data Scientist / Senior Data Science Manager\n\nManages: NA\n\nLocation: Remote - we welcome candidates based in any country\n\nTerm: Permanent position\n\nFT/PT: Full-time\n\nRecruiting process\n\nA cover letter along with a CV will be requested to see how your experience and interest connect to the position. We expect the cover letter to explain how your skills, interests, and aspirations align with the role. 
If selected for consideration, the hiring process for this position will include a formal 45-minute interview with 2-3 staff, followed by a 30-minute administrative screening by a Human Resources manager. Candidates advancing beyond this round will be asked to take a technical assessment. Lastly, an informal 30-minute call with 3-4 members of the Research and Innovation team will be held with finalists.\n\nPlease apply by January 26, 2024\n\nWorking Hours: Global Fishing Watch supports flexible working, so the pattern of hours may vary according to operational and personal needs. The position will be part of a global team spanning many different time zones, so the candidate should be able to accommodate semi-regular early/late meetings to work effectively. Weekend work may be required on occasion. The post holder may be required to undertake regional and international travel. No overtime is payable.\n\nCompensation: The compensation range for this position is US$90,000-$110,000 for US-based employees. For applicants located outside of the US, the pay range will be adjusted to the country of hire. Compensation is commensurate with experience and will vary depending on the hired candidate's country of residence, in accordance with local laws and regulations. GFW offers pension/retirement, health and other benefits commensurate with similar level GFW employees in the country of employment. The role may be filled as a GFW employee or consultant, depending on the country of residence.\n\nEqual opportunities: Global Fishing Watch is an equal opportunities employer. Global Fishing Watch is committed to promoting diversity and inclusion within our organization and in the greater ocean management and conservation community. We believe that diverse backgrounds, skills, knowledge, and viewpoints make us a stronger organization. 
Bringing together professionals who possess broad experiences and a spectrum of perspectives will enable us to reach our goal of improved ocean governance faster. We hire and promote qualified professionals without regard to actual or perceived race, color, religion or belief, sex, sexual orientation, gender identity, marital, or parental status, national origin, age, physical or mental disability or medical condition, or any other characteristic protected by applicable law. Our organizational goals match the urgent challenges facing our global ocean, and our mission is designed to help secure a healthy ocean for all. We are committed to building a workforce that is representative of humanity's diversity, by providing an inclusive and welcoming environment for all employees of Global Fishing Watch and for our partners, vendors, suppliers, and contractors. 
\n\n#Benefits\n
401(k)\n\nDistributed team\n\nAsync\n\nVision insurance\n\nDental insurance\n\nMedical insurance\n\nUnlimited vacation\n\nPaid time off\n\n4 day workweek\n\n401k matching\n\nCompany retreats\n\nCoworking budget\n\nLearning budget\n\nFree gym membership\n\nMental wellness budget\n\nHome office budget\n\nPay in crypto\n\nPseudonymous\n\nProfit sharing\n\nEquity compensation\n\nNo whiteboard interview\n\nNo monitoring system\n\nNo politics at work\n\nWe hire old (and young)\n\n
\n\n#Location\nWashington, District of Columbia, United States
Please reference that you found the job on Remote OK; this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
This job post is closed and the position is probably filled. Please do not apply.
Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. Achieving this vision requires a multitude of disciplines, expertise, and perspectives. One of our core pillars has always been data. As a software engineer focused on the infrastructure aspect of our data stack, you will work on improving healthcare by advancing our data capabilities, best practices, and systems. Our team brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.\n\nThis role can be filled in our San Francisco headquarters OR remotely in the U.S.\n\n**About Us**\n\nOur data teams schedule over 1000 Python pipelines and over 350 Spark pipelines every 24 hours, resulting in over 5000 data processing tasks each day. Additionally, our data endeavors leverage datasets ranging in size from a few hundred rows to a few hundred billion rows. The Doximity data teams rely heavily on Python3, Airflow, Spark, MySQL, and Snowflake. To support this large undertaking, the data infrastructure team uses AWS, Terraform, and Docker to manage a high-performing and horizontally scalable data stack. The data infrastructure team is responsible for enabling and empowering the data analysts, machine learning engineers, and data engineers at Doximity. We provide and evolve a foundation on which to build, and ensure that incidental complexities melt into our abstractions. 
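As a rough illustration of the orchestration problem behind scheduling thousands of pipelines, the sketch below resolves a valid execution order for a small pipeline expressed as a dependency graph, the core idea a scheduler such as Airflow builds on. It uses only the standard library rather than Airflow itself, and the task names are invented, not Doximity's actual pipelines.

```python
from graphlib import TopologicalSorter

def run_order(dependencies):
    """Return one valid execution order: every task appears after all of
    its upstream dependencies have appeared."""
    return list(TopologicalSorter(dependencies).static_order())

# Hypothetical four-task pipeline in the spirit of an Airflow DAG:
# extract feeds both a transform step and a QA check, and the load
# step waits for both branches to finish.
PIPELINE = {
    "extract": set(),
    "transform": {"extract"},
    "qa_check": {"extract"},
    "load": {"transform", "qa_check"},
}
```

A real scheduler adds retries, backfills, and parallel execution of independent branches on top of exactly this kind of ordering.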
Doximity has worked as a distributed team for a long time; pre-pandemic, Doximity was already about 65% distributed.\n\nOur [company core values](https://work.doximity.com/)\n\nOur [recruiting process](https://technology.doximity.com/articles/engineering-recruitment-process-doximity)\n\nOur [product development cycle](https://technology.doximity.com/articles/mofo-driven-product-development)\n\nOur [on-boarding & mentorship process](https://technology.doximity.com/articles/software-engineering-on-boarding-at-doximity)\n\n\n**Here's How You Will Make an Impact**\n\nAs a data infrastructure engineer, you will work with the rest of the data infrastructure team to design, architect, implement, and support data infrastructure, systems, and processes impacting all other data teams at Doximity. You will solidify our CI/CD pipelines, reduce production-impacting issues, and improve monitoring and logging. You will support and train data analysts, machine learning engineers, and data engineers on new or improved data infrastructure systems and processes. A key responsibility is to encourage data best practices through code by continuing the development of our internal data frameworks and libraries. It is also your responsibility to identify and address performance, scaling, or resource issues before they impact our product. 
You will spearhead, plan, and carry out the implementation of solutions while self-managing your time and focus.\n\n**About you**\n\n* You have professional data engineering or operations experience with a focus on data infrastructure\n* You are fluent in Python and SQL, and feel at home in a remote Linux server session\n* You have operational experience supporting data stacks through tools like Terraform, Docker, and continuous integration through tools like CircleCI\n* You are foremost an engineer, making you passionate about high code quality, automated testing, and engineering best practices\n* You have the ability to self-manage, prioritize, and deliver functional solutions\n* You possess advanced knowledge of Linux, Git, and AWS (EMR, IAM, VPC, ECS, S3, RDS Aurora, Route53) in a multi-account environment\n* You agree that concise and effective written and verbal communication is a must for a successful team\n\n**Benefits & Perks**\n\n* Generous time off policy\n* Comprehensive benefits including medical, vision, dental, generous paternity and maternity leave, Life/ADD, 401k, flex spending accounts, commuter benefits, equipment budget, and continuous education budget\n* Stock incentives\n* and much more! For a full list, see our career page\n\n**More info on Doximity**\n\nJoining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 80% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $3.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. 
We're growing steadily, and there are plenty of opportunities for you to make an impact.\n\n*Doximity is proud to be an equal opportunity employer and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.*\n\n\n \n\nPlease mention the word **SATISFACTORILY** when applying to show you read the job post completely. This is a feature to avoid spam applicants. Companies can search these words to find applicants that read this and see they're human.\n\n \n\n#Salary and compensation\n
$120,000 — $160,000/year\n
\n\n#Benefits\n
๐ Distributed team\n\n
\n\n#Location\nNorth America
# How do you apply?\n\nThis job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.