This job post is closed and the position is probably filled. Please do not apply. Work for Lemon.io and want to re-open this job? Use the edit link in the email when you posted the job!
We also have an ASAP project for a Senior Backend Developer for an AI GRC company (Python and Node.js) (European time zones only) with a rate of up to $40/hour.
Are you a talented senior engineer looking for a remote job that lets you show your skills and get better compensation and career growth? Look no further than Lemon.io - the marketplace that connects you with hand-picked startups in the US and Europe.
What do we offer:
- We respect your time: there is no micromanagement or screen trackers.
- You can earn $5k - $8.5k monthly with us; the rate depends on your skills and experience. We've already paid out over $10M to our engineers.
- You will enjoy your work: you can communicate async and choose a schedule that works best for you.
- You will communicate directly with the clients. Most of them have technical backgrounds. Sounds good, yeah?
- We will support you from the moment you apply and throughout our cooperation.
- No more hunting for clients or negotiating rates: let us handle the business side of things so you can focus on what you do best.
- We'll manually find you the best project according to your skills and preferences.
- You will work in a fast-paced startup environment that will keep you motivated and engaged.
- We will connect you with the best developers in the world through our community.
We also collaborate with other companies through staff augmentation. More details are [here](https://lemon.io/partnership-with-lemonio/).
Who we are looking for:
- Senior/Senior+ Data Scientist & Data Engineer.
We are also seeking:
- Senior/Senior+ ML & Data Scientist
- Senior/Senior+ AI & Data Scientist
- Senior/Senior+ AI & ML
Requirements:
- Proven experience in either Data Science or Data Engineering, with a minimum of 3 years of hands-on experience.
- At least 2 years of commercial experience with AI/ML.
- Hands-on experience with Python.
- Experience with AWS/GCP/Azure, SQL, and Airflow is a must.
- Familiarity with NoSQL databases.
- Hands-on experience with Spark, Hadoop, PowerBI, Looker, and BigQuery would be a huge plus.
- Ability to work with large datasets and write efficient code capable of processing and analyzing data at scale.
- Strong analytical and problem-solving skills, with the ability to extract insights and patterns from complex data.
- Good command of English, both written and spoken, as you'll be communicating with clients directly.
- Strong organizational skills: ability to work full-time remotely with no supervision.
- Responsibility: we want to trust you.
- Soft skills: we value clear and effective communication; at the same time, we won't force you to become a public speaker.
ALSO, we have a large number of different projects for Senior Full-Stack Developers. If you have 4+ years of commercial experience in software development and are fluent with Python, Ruby on Rails, React.js, or React Native, we would be happy to talk and provide you with a project that matches your experience. Just apply, and we will share more details with you.
Ready to take your career to the next level? Apply now and join the Lemon.io community!
If your experience matches our requirements, be ready for the next steps:
- VideoAsk (about 10 minutes)
- Completing your me.lemon profile
- 30-minute screening call with our Recruiters
- Technical Interview with our Developers
- Feedback
- Magic Box (we look for the best project for you)
P.S. We work with developers from 59 countries in different regions: Europe, LATAM, Asia (Philippines, Indonesia), Oceania (Australia, New Zealand, Papua New Guinea), Canada and the UK. However, we have some exceptions.
At the moment, we don't have a legal basis to accept applicants from certain European countries: Albania, Belarus, Bosnia and Herzegovina, Croatia, Iceland, Liechtenstein, Kosovo, Montenegro, North Macedonia, Russia, Serbia, and Slovenia. Additionally, there are a few countries in Latin America from which we cannot accept applicants: Cuba and Nicaragua, as well as most Asian countries. Furthermore, we are unable to accept applicants from Africa.
Please note that due to the overwhelming number of applications, only suitable candidates will be contacted for an interview.
We strongly ask you to send your CV in ENGLISH. Applications in English will be considered first. Good luck to everyone!
Please mention the word STRIKING when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$60,000 — $90,000/year
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Revelator and want to re-open this job? Use the edit link in the email when you posted the job!
ABOUT US:
Revelator is a leading provider of business solutions for the music industry. Our all-inclusive music distribution platform, API, protocol, and web3 infrastructure enhance efficiency in music distribution and financial reporting, and simplify royalty operations. We offer a wide range of services, including catalog management, supply chain, income tracking, rights management, and business intelligence. By leveraging our innovative solutions, music businesses can easily navigate the evolving landscape and capitalize on new opportunities.
THE ROLE:
The Data Ops Engineer is responsible for day-to-day technical development and delivery of data pipelines into Revelator's data and analytics platform. You will ensure delivery of solutions based on the backbone of good architecture and best data engineering practices around operational efficiency, security, reliability, performance, and cost optimization.
Key Responsibilities:
Design, build, and optimize data engineering pipelines to extract data from different sources and applications and feed it into the cloud data platform.
Build, test, and productize data extraction, transformation, and reporting solutions within the cloud platform.
Provide accurate and timely information that can be used in day-to-day operational and strategic decision-making.
Code, test, and document new or modified data models and ETL/ELT tools to create robust and scalable data assets for reporting and analytics.
Contribute to our ambition to develop a best practice Data and Analytics platform, leveraging next generation cloud technologies.
Define and build the data pipelines that will enable faster, better, data-informed decision-making within the business.
Ensure data integrity within reports and dashboards by reviewing data, identifying and resolving gaps and inconsistencies, and escalating as required to foster a partnered approach to data accuracy for business reporting purposes.
Requirements:
Bachelor's degree in Computer Science, Engineering, or related field.
5+ years of relevant work experience as a Data Ops/Data Integration Engineer, including:
Building ETL/ELT solutions for large scale data pipelines.
General expertise with SQL and database management (Azure SQL Server), including performance optimization.
Experience with CI/CD for data pipelines
Using DataOps practices to develop data flows and enable continuous use of data.
Data modeling
Data analysis
Developing technical and support documentation, and translating business requirements into reports and models.
Required Technical Skills
Azure Data Factory
PowerBI, PowerBI scripting & automation
Snowflake, Snowpipes
.NET / C#
Python
SQL, Stored Procedures
Other Skills
Excellent problem-solving skills and the ability to work independently.
Strong teamwork and collaboration skills with the ability to lead and mentor junior developers.
Exceptional communication skills, both written and verbal in English.
Please mention the word AFFECTION when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$70,000 — $100,000/year
Benefits
Distributed team
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
Proxify is hiring a Remote Senior Microsoft PowerBI Developer
About us:
Talent has no borders. Proxify's mission is to connect top developers around the world with opportunities they deserve. So, it doesn't matter where you are; we are here to help you fast-track your independent career in the right direction.
Since our launch, Proxify's developers have successfully worked with 1200+ happy clients to build their products and growth features. 3500+ talented developers trust Proxify and its network to fulfill their dreams and objectives.
Proxify is shaped by a global network of supportive, talented developers interested in remote full-time jobs. Our Glassdoor (4.5/5) and Trustpilot (4.8/5) ratings reflect the trust developers place in us and our commitment to our members' success.
The Role:
We are looking for a Senior Microsoft Power BI developer with commercial experience for one of our clients. You are a perfect candidate if you are growth-oriented, you love what you do, and you enjoy working on new ideas to develop exciting products and growth features.
What we are looking for:
Background with BI tools and systems such as Power BI, Tableau, and SAP
Prior experience in data-related tasks
Understanding of the Microsoft BI Stack
Mastery of data analytics
Proficiency in software development
Familiarity with MS SQL Server BI Stack tools and technologies, such as SSRS and T-SQL, Power Query, MDX, PowerBI, and DAX
Analytical thinking for converting data into relevant reports and graphics
Ability to handle row-level data security
Knowledge of Power BI application security layer models
Ability to run DAX queries on Power BI desktop
Proficiency in doing advanced-level computations on the data set
Excellent communication skills are required to communicate needs with clients and internal teams successfully
Nice-to-have:
Time zone: CET (+/- 3 hours)
Responsibilities:
Convert business needs into technical specifications and establish a timetable for job completion
Create, test, and deploy Power BI scripts, as well as execute efficient deep analysis
Use Power BI to run DAX queries and functions
Create charts and data documentation with explanations of algorithms, parameters, models, and relationships
Construct a data warehouse
Use SQL queries to get the best results
Make technological adjustments to current BI systems to improve their performance
Analyse current ETL procedures to define and create new systems
What Proxify offers
Career-accelerating positions at cutting-edge companies Discover exclusive long-term remote engagements at the world's most interesting product companies.
Hand-picked opportunities, just for you Skip the typical recruitment roadblocks and biases with personally matched engagements.
Fast-track your independent developer career Start small and gain more freedom to take on new engagements as you build your independent developer career.
A recruitment process that values your time Only one hiring process with the possibility of several positions, without any additional tests.
Please mention the word DEXTROUS when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
๐ Please reference you found the job on Remote OK, this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
Talent has no borders. Proxify's mission is to connect top developers around the world with opportunities they deserve. So, it doesn't matter where you are; we are here to help you fast-track your independent career in the right direction.
Since our launch, Proxify's developers have successfully worked with 1200+ happy clients to build their products and growth features. 3500+ talented developers trust Proxify and its network to fulfill their dreams and objectives.
Proxify is shaped by a global network of supportive, talented developers interested in remote full-time jobs. Our Glassdoor (4.5/5) and Trustpilot (4.8/5) ratings reflect the trust developers place in us and our commitment to our members' success.
The Role:
We are looking for a Senior Data Engineer for one of our clients. You are a perfect candidate if you are growth-oriented, you love what you do, and you enjoy working on new ideas to develop exciting products.
What we are looking for:
5+ years of solid experience as a Data Engineer in a top-notch environment.
3+ years of experience with Cloud Infrastructures (e.g. Azure or AWS), virtualisation, and containerisation environments (e.g. VMware, Docker, Kubernetes).
Strong knowledge of software development processes including testing, continuous integration/delivery, automated deployment and verification/maintenance.
High-degree of ambition for self-improvement and self-initiative.
Ability to work with minimal supervision.
Intermediate-advanced English level.
You can communicate well with both technical and non-technical clients.
Nice-to-have:
Experience in data warehousing and data modeling.
Solid understanding of relational database systems.
Knowledge in working with Apache Spark.
Time zone: CET (+/- 3 hours).
Azure certifications in Cloud development and architecture would be a plus.
Responsibilities:
Competency in implementing best practices in AI/ML development to ensure that data pipelines and solutions are:
Effectively and efficiently tailored towards specific applications (automated processes on hybrid cloud/on-prem infrastructure).
Scalable and maintainable to address an extensive customer community.
Secure "on-prem" to protect the client's IP.
Knowledgeably built with the infrastructure upon which the prediction models will run.
What Proxify offers
Career-accelerating positions at cutting-edge companies Discover exclusive long-term remote engagements at the world's most interesting product companies.
Hand-picked opportunities, just for you Skip the typical recruitment roadblocks and biases with personally matched engagements.
Fast-track your independent developer career Start small and gain more freedom to take on new engagements as you build your independent developer career.
A recruitment process that values your time Only one hiring process with the possibility of several positions, without any additional tests.
Please mention the word PERMISSIBLE when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
This job post is closed and the position is probably filled. Please do not apply. Work for Nadine West and want to re-open this job? Use the edit link in the email when you posted the job!
Data Analyst - Senior Analyst for Customer Analytics & Insights
This is not your average data analyst role.
We're on the hunt for a data whiz who thrives on turning data into strategic insights that drive customer engagement, boost LTV, and fuel business growth. This is more than just a 9-to-5 gig; it's about transforming raw data into actionable recommendations that revolutionize our marketing, sales, and merchandising game. But let's not get ahead of ourselves.
First things first:
Normal Job: You receive instructions, you follow them to the letter.
Nadine West: You embrace the data, you seek out opportunities for improvement, and you help write the playbook for success.
What Is The Role?
We don't want to box you into a rigid job description. This isn't about conforming to a predefined role; it's about unleashing your data mastery to achieve our goals.
So, here's the deal:
Identify Opportunities: Dive into the data to uncover what's holding us back from reaching our goals.
Take Action: Turn those insights into action, and do it with lightning speed.
What You Bring To The Table:
Data is your playground, and you're the master of its game. You speak the language of SQL and Google Analytics like a pro, cloud and big data platforms don't scare you either. Tableau is your trusty sidekick in this adventure.
You've got an artistic side too - not in the traditional sense, but in the way you craft A/B tests and analyze statistical techniques to measure the effectiveness of our marketing, sales, and merchandising strategies. Excel and PowerPoint are second nature to you, trusty allies in presenting your findings with finesse.
But here's the secret sauce:
You're not just crunching numbers; you're telling stories. Your insights breathe life into our CRM and Digital analytics, enabling us to optimize campaigns, enhance audience targeting, and craft compelling creative messages that resonate with our customers.
Leadership:
We're not looking for someone to follow a script; we want someone to write it. Step up, take charge, and guide us with your data-driven brilliance. We thrive on pushing boundaries, and we expect nothing less from you. If something doesn't add up, you're not afraid to call it out. Your eye for detail ensures that nothing slips through the cracks.
Scale:
We're a close-knit team, and we all wear multiple hats. You'll have to do the same, nothing is too big and nothing is too far in the weeds.
This Is Interesting. Now What?
Talk is cheap, and resumes only tell part of the story. We believe in putting our candidates to the test - a paid test. It's not a grueling marathon; think of it as a data-driven sprint (3-5 hours of work).
Show us your stuff:
How you dig deep to identify opportunities
How you transform data into actionable insights, even with limited exposure to our company
The breadth of your skills beyond mere data analysis
Please mention the word ADVANTAGES when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$30,000 — $120,000/year
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Proxify and want to re-open this job? Use the edit link in the email when you posted the job!
About us:
Talent has no borders. Proxify's mission is to connect top developers around the world with the opportunities they deserve. So, it doesn't matter where you are; we are here to help you fast-track your independent career in the right direction.
Since our launch, Proxify's developers have successfully worked with 1200+ happy clients to build their products and growth features. 3500+ talented developers trust Proxify and its network to fulfill their dreams and objectives.
Proxify is shaped by a global network of supportive, talented developers interested in remote full-time jobs.
The Role:
We are looking for a Senior Data Engineer for one of our clients. You are a perfect candidate if you are growth-oriented, you love what you do, and you enjoy working on new ideas to develop exciting products.
What we are looking for:
5+ years of solid experience as a Data Engineer in a top-notch environment.
3+ years of experience with Cloud Infrastructures (e.g. Azure or AWS), virtualization, and containerization environments (e.g. VMware, Docker, Kubernetes).
Strong knowledge of software development processes including testing, continuous integration/delivery, automated deployment, and verification/maintenance.
High degree of ambition for self-improvement and self-initiative.
Ability to work with minimal supervision.
Intermediate-advanced English level.
You can communicate well with both technical and non-technical clients.
Nice-to-have:
Experience in data warehousing and data modeling.
Solid understanding of relational database systems.
Knowledge in working with Apache Spark.
Time zone: CET (+/- 3 hours).
Azure certifications in Cloud development and architecture would be a plus.
Responsibilities:
Competency in implementing best practices in AI/ML development to ensure that data pipelines and solutions are:
Effectively and efficiently tailored towards specific applications (automated processes on hybrid cloud/on-prem infrastructure).
Scalable and maintainable to address an extensive customer community.
Secure "on-prem" to protect the client's IP.
Knowledgeably built with the infrastructure upon which the prediction models will run.
What Proxify offers
Career-accelerating positions at cutting-edge companies Discover exclusive long-term remote engagements at the world's most interesting product companies.
Hand-picked opportunities, just for you Skip the typical recruitment roadblocks and biases with personally matched engagements.
Fast-track your independent developer career Start small and gain more freedom to take on new engagements as you build your independent developer career.
A recruitment process that values your time Only one hiring process with the possibility of several positions, without any additional tests.
Please mention the word IMPROVEMENTS when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$50,000 — $80,000/year
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for CoinLedger and want to re-open this job? Use the edit link in the email when you posted the job!
You will be the owner of the import experience within the CoinLedger platform which currently processes hundreds of millions of transactions every year spread out across 300+ centralized exchange integrations. Supporting such a significant number of integrations is a large undertaking and will require the ability to implement new systems in order to maintain the quality our users expect.
At first, this will be a completely technical role where you work closely with Lucas and Mitchell (CoinLedger's two technical co-founders) to get familiarized with the system, understand our processes, and take over development for net new and existing integrations. Your goal here is to become an expert with our current approach to building integrations and become a vacuum for new data across crypto platforms that CoinLedger does not yet support.
Once this ramp-up phase is complete, your role will transition to a split of managerial work and technical work. You will work alongside the co-founders to bring on a team of 3-4 other developers and help them ramp up with the knowledge you have gained. You and the team will be responsible for continuously shipping improvements to our current integrations, launching new integrations, and working to make the import experience as amazing as possible.
With the team in place, your technical responsibilities will shift from individual integration work to leading long term and high leverage projects which will impact the integration experience as a whole. On the near-term roadmap, we have the following in mind:
Improving import infrastructure to enable real-time transaction syncing from crypto exchanges
Building robust monitoring systems to react to breaking changes from crypto exchanges
Building new workflows for enabling real-time balances to be fetched from crypto exchanges to assist users with reconciling their transaction history
This will be an extremely high impact, challenging role in a fast paced environment. However, you will have the opportunity to shape and impact the creation of a net-new team within CoinLedger from the ground up.
We hope you are up for the challenge.
About you
You are a crypto enthusiast and invest in crypto yourself
You have a strong understanding of major centralized players in the crypto market and can prioritize accordingly in your integration work
You have a passion for working with data: APIs, CSV files, and Excel spreadsheets
You aren't afraid of diving head first into poorly documented APIs for crypto exchanges
You have 3+ years experience as a software engineer working on backend systems
You have worked professionally on a C# / .NET codebase
You are able to strike a balance between speed of delivery and quality of development
You have the ability to zoom-in and help solve technical issues when the need arises
You have experience and are comfortable managing a team of developers, including assigning work, leading stand-ups, performing code-reviews, assisting with QA, and communicating business context for projects as needed
You are comfortable working closely with members of our support team in order to help resolve issues and improve the import experience
You are comfortable working with outside contractors to source transaction history data from crypto exchanges in other countries
You provide technical leadership to the team and can work with engineers to guide them through important technical decisions
You have experience working with engineers at different levels and have coached them in their career development.
In the first month, you will:
Work directly alongside CoinLedger's two technical co-founders to gain an understanding of the current import experience
Ship bug fixes and improvements for our existing integrations
Perform an audit on high impact API integrations and execute a plan to improve both the success rate and percentage of transactions CoinLedger imports
Build and ship a new file integration
Build and ship a new API integration
Document and create a plan to improve gaps in current processes
Within the first 3 months, you will:
Work alongside the founding team to hire 3-4 additional engineers who will report to you
Lead the onboarding process for the new engineers and bring them up to speed on team principles and processes so they will be able to build and maintain integrations
Identify key areas to improve import resiliency and ship these changes
Take over leading the core integrations team operations
Within the first 6 months, you will:
Have autonomy over the direction of the integrations team and roadmap
Ship multiple improvements to improve both the resiliency of our integrations and the overall operations of the integrations team in general
Ship multiple features which improve the import experience for our users
What success looks like:
You lead the team to achieve targets for team goals which include:
Accuracy of imported accounting data
Success rate of both high priority and secondary exchanges
Measured classification coverage of imported transactions
You push the pace to improve
Development velocity of new integration launches
Time to recovery when integrations break
Development velocity on new features
You identify problem areas in our current process and implement improvements which will:
Make the import experience better for our customers
Make the development & debugging process better for the whole integrations team
What we offer:
$100,000 - $120,000 USD / year salary
Competitive equity in a fast-growing startup
Completely remote position
Autonomy over building and running your own engineering team
Healthcare benefits
Paid time off
Please mention the word EFFORTLESS when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==). This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$100,000 — $120,000/year
Benefits
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Company retreats
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Paperpile and want to re-open this job? Use the edit link in the email when you posted the job!
Description
Paperpile helps researchers to find, organize, read, and write academic papers.
Our user base is growing fast, and so is the team around it.
As part of a small team, everything you do matters. You only work on things that have a big, direct impact on the product and our customers.
Work from anywhere on your own schedule. As a remote-first company, we communicate with tools like Slack, Figma, and Notion to move our product forward fast.
Work with an interesting and diverse community of academics. Our customers use Paperpile to study climate change, cancer, or medieval history. You interact with them directly to understand how we can make their work more productive.
Our backend infrastructure stores, processes, and searches hundreds of millions of academic articles. We use Node.js, TypeScript, MongoDB, ElasticSearch, and AWS.
You will take responsibility for substantial parts of our codebase to provide a fast and reliable backend for all our products (web, mobile, desktop). You can also work on data-heavy projects, including applications of large language models and AI.
Requirements
You can work independently and write clean, reusable, and testable code.
You can work and communicate within a Scrum team and produce production-ready code efficiently and on time.
You know Node.js and TypeScript/JavaScript, or have used a different backend stack and want to learn Node.js/TypeScript.
You have worked on data-heavy applications before and have experience with the required database and backend technologies.
You can design and implement REST APIs.
More useful experience (optional):
AWS, ElasticSearch, MongoDB, Linux
Basic data science skills (web scraping, data transformations, data cleaning, data normalization, ...)
AI and language models
Background in research or academia
Benefits
Base compensation €40,000-€72,000 based on the level of your experience (plus variable bonus).
4 weeks paid vacation + local holidays.
Learn and grow. Try out new things. We sponsor relevant courses, seminars, and conferences.
Please mention the word EVALUATIVE when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==).
Salary and compensation
$40,000 — $80,000/year
Benefits
Distributed team
Paid time off
Learning budget
Home office budget
No whiteboard interview
No monitoring system
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
This job post is closed and the position is probably filled. Please do not apply. Work for Tessera and want to re-open this job? Use the edit link in the email when you posted the job!
Total Compensation Value:
$80k - $180k (Salary + equity)
NOTE: Actual total compensation offer will vary based on applicant location / cost of living, skillsets, and level of relevant experience
Time zones:
Eastern (UTC -05:00) and Central European (UTC +01:00)
About the company:
Tessera provides ownership of the world's most sought-after NFTs! Working at Tessera, you will be building on the cutting edge of art, finance, and blockchain technology to help shape the future of digital collecting experiences.
You will get to work, learn and grow with an experienced team supported by incredible partnerships and committed investments from developers, collectors, investors, and thought leaders deeply passionate about the decentralization ecosystem.
We are looking for an exceptional data engineer to join our team. You will work closely with our CTO and web stack team to build our databases in anticipation of future data needs, identify where to find that data, and build scalable backend infrastructure. You will be expected to discover and aggregate data from different sources and blockchains; your skill set should include designing database architectures, data structures, pipelines, ETL processes, and APIs/SDKs, along with some backend software development, as we build our foundation for rapid growth, future data science initiatives, and continued innovation in this exciting space! Experience with blockchain, NFTs, and DeFi is preferred.
Integrations: Various APIs, browser-based crypto wallets (e.g., MetaMask), etc.
What to expect
Make a HUGE impact helping to bring ideas to reality
Learn highly valuable and complex concepts related to art, finance, and technology in a full-time job in an exponentially growing industry, on a team with some of the leading NFT influencers
Become a core member of a very passionate team in a friendly environment
Work within a dynamic team, who challenges the status quo and champions agile working plus continuous improvement
In this role you will be expected to...
Work with our development team to continually release technical enhancements
Be responsible for identifying and anticipating our data needs and optimizing our backend software infrastructure, setting data priorities, and building a scalable foundation for our fast-evolving site tessera.co
Own the data lifecycle (from ingest and automated quality checks, to discovery and usage, and database setup)
Build custom integrations between cloud-based or blockchain-based systems using APIs and other data sources
Design efficient data structures, database schemas and ETL for long-term sustainability
Ship high-quality, well-tested, secure, and maintainable code
Write server scripts and APIs
Routinely inspect server code for speed optimization and practical trade-offs
Build a scalable NFT metadata backend infrastructure for tessera.co
Incorporate data processing and workflow management tools into pipeline design (AWS etc.)
Design, develop, and optimize data pipelines and backend services for real-time decisioning, reporting, data collection, and related features / functions
Drive strategic technology decisions related to the appropriate data stores for the job (e.g., warehouses etc.)
(Long-term) Architect, build, and launch new data models that provide intuitive analytics to the team
Wrangle large-scale data sets from the blockchain and other site APIs (e.g., OpenSea)
Build data expertise and own data quality for the pipelines you create
Strong technical and non-technical communication abilities, both verbal and written
Our fast-paced, agile development environment also requires a penchant for task management and respect for efficient, best-practice development principles!
What we're looking for
6+ months of tinkering and/or participating somewhere in Web3 (DeFi, NFTs, DAOs, etc.)
Experience with querying and interacting with EVM & non-EVM Based blockchains
Understanding of the low-level idiosyncrasies of popular blockchains, including Ethereum and Solana
3 or more years of relevant software experience in a data or backend-focused role
Strong experience with two or more of the following languages: Python, SQL, Javascript, Scala
Experience designing data structures, database schemas, and ETL pipelines from scratch
Experience with workflow systems such as Apache Airflow
2 or more years of professional work experience on ETL pipeline implementation using services such as PySpark, Glue, Dataflow, Lambda, Athena, S3, GCS, SNS, Pub/Sub, Kinesis, etc.
Experience with scalable cloud-based solutions
A pro-active and autonomous team player, self-starter with the ability to anticipate future needs
Capable of prioritizing multiple projects in order to meet goals without management oversight
Experience in communicating with users, other technical teams, and product management to understand requirements, describe data priorities and challenges, and technical design needs
Excellent writing skills and the ability to drive via influence
Proficiency in the English language, both written and verbal, sufficient for success in a remote and largely asynchronous work environment
Strong attention to detail
Bonus points for
BS/MS in Computer Science, Computer Engineering, or a related technical field
Previous experience in a rapidly scaling start-up environment
Professional work experience using real-time streaming systems (Kafka/Kafka Connect, Spark, Flink or AWS Kinesis)
Previous experience building Analytics/BI systems from scratch
Previous experience building large-scale data architectures
Previous experience in BI or Data Science
What we're offering
Competitive salary (and equity) in an exciting space driving disruptive innovation
The opportunity to play a key voice in our growing organization
A remote work environment with competitive benefits and holidays
7 additional company holidays, including all-company week-long winter break
Medical, Dental, and Vision Insurance for US-based employees
Agile working environment with flexible working hours and location, career advancement, and competitive compensation package
Optional offsite social events to help our employees become familiar with each other and our culture
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States
If you're convinced you are the right fit and you can't wait to join our team, we look forward to hearing from you!
Once you've applied, please be patient :) it may take us up to 2-3 weeks to get back to you!
Don't meet every single requirement?
Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Tessera we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway.
You may be just the right candidate for this or other roles.
Please mention the word EXHILARATE when applying to show you read the job post completely (#RMTAwLjI2LjM1LjExMQ==).
Salary and compensation
$80,000 — $180,000/year
Benefits
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Company retreats
Equity compensation
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.