About Sayari:
Sayari is the transparency company providing the public and private sectors with immediate visibility into complex commercial relationships by delivering the largest commercially available collection of corporate and trade data as a dynamic model of global ownership and trade activity. Sayari's solutions harness this model to enable risk resilience, complex investigations, and clear-eyed business decisions. Sayari is headquartered in Washington, D.C., and its solutions are used by thousands of frontline analysts in over 35 countries.

Our company culture is defined by a dedication to our mission of using open data to enhance visibility into global commercial and financial networks, a passion for finding novel approaches to complex problems, and an understanding that diverse perspectives create optimal outcomes. We embrace cross-team collaboration, encourage training and learning opportunities, and reward initiative and innovation. If you like working with supportive, high-performing, and curious teams, Sayari is the place for you.

POSITION DESCRIPTION
Sayari provides instant access to structured business information from hundreds of millions of corporate, legal, and trade records for a variety of use cases. As a member of Sayari's data team, you will work with our Product and Software Engineering teams to build the graph that underlies Sayari's products.

Please note that we cannot provide H1B and/or visa sponsorship for this role at this time.

Job Responsibilities
* Build and maintain ETL pipelines to process and export record data to the Sayari Graph application
* Develop and improve entity resolution processes
* Implement logic to calculate and export risk information
* Work with the product team and other development teams to collect and refine requirements
* Run and maintain regular data releases

Required Skills & Experience
* Expertise with Python or a JVM programming language (e.g., Java, Scala)
* Expertise with SQL databases (e.g., Postgres)
* 0-2 years of experience designing, maintaining, and orchestrating ETL pipelines (e.g., Apache Spark, Apache Airflow) in cloud-based environments (e.g., GCP, AWS, or Azure)

Additional Preferred Skills & Experience
* Experience with entity resolution, graph theory, and/or distributed computing
* Experience with Kubernetes
* Experience working as part of an agile development team using Scrum, Kanban, or similar

$85,000 - $95,000 a year
The target base salary for this position is $85,000 - $95,000 USD plus bonus and equity. Final offer amounts are determined by multiple factors, including location, local market variances, candidate experience and expertise, and internal peer equity, and may vary from the amounts listed above.

Benefits:
· 100% fully paid medical, vision, and dental for employees and their dependents
· Generous time off; we observe all US federal holidays, close our office for a winter break (12/24-12/31), and grant 18 PTO days and 10 sick days
· Outstanding compensation package; competitive commissions for revenue roles and quarterly bonuses for non-revenue positions
· A strong commitment to diversity, equity, and inclusion
· Eligibility to participate in additional benefits such as 401k match up to 5%, 100% paid life insurance (up to $100,000 coverage), and parental leave
· A collaborative and positive culture - your team will be as smart and driven as you
· Limitless growth and learning opportunities

Sayari is an equal opportunity employer and strongly encourages diverse candidates to apply. We believe diversity and inclusion mean our team members should reflect the diversity of the United States. No employee or applicant will face discrimination or harassment based on race, color, ethnicity, religion, age, gender, gender identity or expression, sexual orientation, disability status, veteran status, genetics, or political affiliation. We strongly encourage applicants of all backgrounds to apply.

#Salary and compensation
No salary data was published by the company, so we estimated a range based on similar Python and Engineer jobs:

$65,000 — $140,000/year
#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
United States
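The entity resolution work named in the responsibilities above can be sketched in miniature. This is an illustrative toy under assumed rules (exact-match on a normalized name), not Sayari's actual process; all function names and normalization rules here are hypothetical:

```python
from collections import defaultdict

def normalize(name: str) -> str:
    """Normalize a company name for comparison (illustrative rules only)."""
    stop = {"inc", "ltd", "llc", "corp", "co"}  # assumed suffix list
    tokens = [t.strip(".,") for t in name.lower().split()]
    return " ".join(t for t in tokens if t and t not in stop)

def resolve_entities(records):
    """Group records whose normalized names match exactly.

    records: iterable of (record_id, raw_name) pairs.
    Returns a dict mapping each normalized name to the record ids
    resolved to that entity.
    """
    clusters = defaultdict(list)
    for record_id, raw_name in records:
        clusters[normalize(raw_name)].append(record_id)
    return dict(clusters)

records = [
    ("r1", "Acme Holdings Ltd."),
    ("r2", "ACME HOLDINGS"),
    ("r3", "Globex Corp"),
]
print(resolve_entities(records))
# {'acme holdings': ['r1', 'r2'], 'globex': ['r3']}
```

Production entity resolution adds blocking, fuzzy matching, and attribute comparison on top of this kind of key-based grouping; the sketch only shows the core grouping step.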
Please mention that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay for equipment which the company then reimburses later, and never pay for required trainings. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams; don't use them or pay for them. Always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name of the site/email and see if it's the company's actual main domain name. Scams in remote work are rampant, so be careful. When clicking the apply button above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.
About Zeotap

Founded in Berlin in 2014, Zeotap started with a mission to provide high-quality data to marketers. As we evolved, we recognized a greater challenge: helping brands create personalized, multi-channel experiences in a world that demands strict data privacy and compliance. This drive led to the launch of Zeotap's Customer Data Platform (CDP) in 2020, a powerful, AI-native SaaS suite built on Google Cloud that empowers brands to unlock and activate customer data securely.

Today, Zeotap is trusted by some of the world's most innovative brands, including Virgin Media O2, Amazon, and Audi, to create engaging, data-driven customer experiences that drive better business outcomes across marketing, sales, and service. With a unique background in high-quality data solutions, Zeotap is a leader in the European CDP market, empowering enterprises with a secure, privacy-first solution to harness the full potential of their customer data.

Responsibilities:
* You will design and implement robust, scalable, and high-performance data pipelines using Spark, Scala, and Airflow, with familiarity with Google Cloud.
* You will develop, construct, test, and maintain architectures such as databases and large-scale processing systems.
* You will assemble large, complex data sets that meet functional and non-functional business requirements.
* You will build the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from various data sources.
* You will collaborate with data scientists and other stakeholders to improve data models that drive business processes.
* You will implement data flow and tracking using Apache Airflow.
* You will ensure data quality and integrity across various data processing stages.
* You will monitor and optimize the performance of the data processing pipeline.
* You will tune and optimize Spark jobs for performance and efficiency, ensuring they run effectively on large-scale data sets.
* You will troubleshoot and resolve issues related to data pipelines and infrastructure.
* You will stay up to date with the latest technologies and best practices in Big Data and data engineering.
* You will adhere to Zeotap's company, privacy, and information security policies and procedures.
* You will complete all assigned awareness trainings on time.

Requirements:
* 2+ years of experience building and deploying high-scale solutions
* Very good problem-solving skills and clear fundamentals of data structures and algorithms
* Expert coding skills in Java or Scala
* Expert coding skills in Go or Python are a huge plus
* Apache Spark or other Big Data stack experience is mandatory
* High-level and low-level design skills
* Deep understanding of any OLTP, OLAP, NoSQL, or graph database is a huge plus
* Deep knowledge of distributed systems and design is a huge plus
* Hands-on experience with streaming technologies like Kafka, Flink, Samza, etc. is a huge plus
* Good knowledge of scalable Big Data technologies
* Bachelor's or Master's degree in information systems, computer science, or a related field is preferred

What we offer:
* Competitive compensation and attractive perks
* Health insurance coverage
* Flexible working support, guidance, and training provided by a highly experienced team
* Fast-paced work environment
* Work with very driven entrepreneurs and a network of global senior investors across telco, data, advertising, and technology

Zeotap welcomes all - we are an equal employment opportunity and affirmative action employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status.

Interested in joining us?

We look forward to hearing from you!

#Salary and compensation
No salary data was published by the company, so we estimated a range based on similar Design, SaaS, Python, Cloud, Senior, and Engineer jobs:

$52,500 — $92,500/year
#Location
Bengaluru, Karnataka
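The extract-transform-load responsibilities described above follow a standard pipeline shape. A minimal pure-Python sketch of the three stages follows; the real pipeline would run in Spark/Scala and be orchestrated by Airflow, and every field name here is a hypothetical stand-in:

```python
def extract(rows):
    """Extract: yield raw source rows (stand-in for reading from storage)."""
    yield from rows

def transform(rows):
    """Transform: drop invalid rows and standardize fields."""
    for row in rows:
        if row.get("user_id") is None:
            continue  # basic data-quality check: require a user id
        yield {"user_id": row["user_id"], "country": row.get("country", "").upper()}

def load(rows, sink):
    """Load: append transformed rows to a sink (stand-in for a warehouse write)."""
    for row in rows:
        sink.append(row)
    return sink

raw = [{"user_id": 1, "country": "de"}, {"user_id": None}, {"user_id": 2}]
warehouse = load(transform(extract(raw)), [])
print(warehouse)
# [{'user_id': 1, 'country': 'DE'}, {'user_id': 2, 'country': ''}]
```

Because each stage is a generator, rows stream through without materializing intermediate lists, which is the same laziness a Spark job exploits at much larger scale.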
Government Employees Insurance Company is hiring a Remote Staff Engineer.
Our Senior Staff Engineer works with our Staff and Sr. Engineers to innovate and build new systems, improve and enhance existing systems, and identify new opportunities to apply your knowledge to solve critical problems. You will lead the strategy and execution of a technical roadmap that will increase the velocity of delivering products and unlock new engineering capabilities. The ideal candidate is a self-starter who has deep technical expertise in their domain.

Position Responsibilities
As a Senior Staff Engineer, you will:
* Provide technical leadership to multiple areas and provide technical and thought leadership to the enterprise
* Collaborate across team members and across the tech organization to solve our toughest problems
* Develop and execute technical software development strategy for a variety of domains
* Be accountable for the quality, usability, and performance of the solutions
* Utilize programming languages like C#, Java, Python or other object-oriented languages, SQL and NoSQL databases, container orchestration services including Docker and Kubernetes, and a variety of Azure tools and services
* Be a role model and mentor, helping to coach and strengthen the technical expertise and know-how of our engineering and product community
* Influence and educate executives
* Consistently share best practices and improve processes within and across teams
* Analyze costs and forecasts, incorporating them into business plans
* Determine and support resource requirements, evaluate operational processes, measure outcomes to ensure desired results, and demonstrate adaptability by sponsoring continuous learning

Qualifications
* Exemplary ability to design, perform experiments, and influence engineering direction and product roadmap
* Experience partnering with engineering teams and transferring research to production
* Extensive experience leading and building full-stack application and service development, with a strong focus on SaaS products/platforms
* Proven expertise in designing and developing microservices using C#, gRPC, Python, Django, Kafka, and Apache Spark, with a deep understanding of both API and event-driven architectures
* Proven experience designing and delivering highly resilient event-driven and messaging-based solutions at scale with minimal latency
* Deep hands-on experience building complex SaaS systems in large-scale, business-focused systems, with strong knowledge of Docker and Kubernetes
* Fluency and specialization with at least two modern OOP languages such as C#, Java, C++, or Python, including object-oriented design
* Strong understanding of open-source databases like MySQL and PostgreSQL, and a strong foundation with NoSQL databases like Cosmos, Cassandra, and Apache Trino
* In-depth knowledge of CS data structures and algorithms
* Ability to excel in a fast-paced, startup-like environment
* Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
* Experience with microservices-oriented architecture and extensible REST APIs
* Experience building the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems
* Experience implementing security protocols across services and products: understanding of Active Directory, Windows Authentication, SAML, OAuth
* Fluency in DevOps concepts, cloud architecture, and the Azure DevOps Operational Framework
* Experience leveraging PowerShell scripting
* Experience with existing operational portals such as the Azure Portal
* Experience with application monitoring tools and performance assessments
* Experience with Azure networking (subscriptions, security zoning, etc.)
* 10+ years of full-stack development experience (C#/Java/Python/Go), with expertise in client-side and server-side frameworks
* 8+ years of experience with architecture and design
* 6+ years of experience with open-source frameworks
* 4+ years of experience with AWS, GCP, Azure, or another cloud service

Education
Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience

Annual Salary
$115,000.00 - $260,000.00
The above annual salary range is a general guideline. Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, the work location, as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position.

Benefits: As an Associate, you'll enjoy our Total Rewards Program* to help secure your financial future and preserve your health and well-being, including:
* Premier Medical, Dental and Vision Insurance with no waiting period**
* Paid Vacation, Sick and Parental Leave
* 401(k) Plan
* Tuition Reimbursement
* Paid Training and Licensures

*Benefits may be different by location. Benefit eligibility requirements vary and may include length of service.
**Coverage begins on the date of hire. Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship on the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment, with mutual respect by and for all associates and applicants.

For more than 75 years, GEICO has stood out from the rest of the insurance industry! We are one of the nation's largest and fastest-growing auto insurers thanks to our low rates, outstanding service and clever marketing. We're an industry leader employing thousands of dedicated and hard-working associates. As a wholly owned subsidiary of Berkshire Hathaway, we offer associates training and career advancement in a financially stable and rewarding workplace.

#Salary and compensation
No salary data was published by the company, so we estimated a range based on similar Design, SaaS, Python, Docker, DevOps, Education, Cloud, API, Senior, and Engineer jobs:

$47,500 — $97,500/year
#Location
MD Chevy Chase (Office) - JPS
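The event-driven architectures this role emphasizes revolve around publishers emitting events that decoupled subscribers react to. A minimal in-memory publish/subscribe sketch follows; real systems would use a broker such as Kafka, and the topic and payload names here are made up for illustration:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory publish/subscribe bus (illustrative only)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to be invoked for every event on `topic`."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver `event` to every handler subscribed to `topic`."""
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("policy.created", received.append)  # hypothetical topic name
bus.publish("policy.created", {"policy_id": 42})
print(received)
# [{'policy_id': 42}]
```

The point of the pattern is that the publisher never references its consumers: adding a new subscriber requires no change to the publishing code.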
About Sayari:
Sayari is the counterparty and supply chain risk intelligence provider trusted by government agencies, multinational corporations, and financial institutions. Its intuitive network analysis platform surfaces hidden risk through integrated corporate ownership, supply chain, trade transaction, and risk intelligence data from over 250 jurisdictions. Sayari is headquartered in Washington, D.C., and its solutions are used by thousands of frontline analysts in over 35 countries.

Our company culture is defined by a dedication to our mission of using open data to enhance visibility into global commercial and financial networks, a passion for finding novel approaches to complex problems, and an understanding that diverse perspectives create optimal outcomes. We embrace cross-team collaboration, encourage training and learning opportunities, and reward initiative and innovation. If you like working with supportive, high-performing, and curious teams, Sayari is the place for you.

Job Description:
Sayari's flagship product, Sayari Graph, provides instant access to structured business information from billions of corporate, legal, and trade records. As a member of Sayari's data team, you will work with the Product and Software Engineering teams to collect data from around the globe, maintain existing data pipelines, and develop new pipelines that power Sayari Graph.

Job Responsibilities:
* Write and deploy crawling scripts to collect source data from the web
* Write and run data transformers in Scala Spark to standardize bulk data sets
* Write and run modules in Python to parse entity references and relationships from source data
* Diagnose and fix bugs reported by internal and external users
* Analyze and report on internal datasets to answer questions and inform feature work
* Work collaboratively on and across a team of engineers using agile principles
* Give and receive feedback through code reviews

Skills & Experience:
* Professional experience with Python and a JVM language (e.g., Scala)
* 2+ years of experience designing and maintaining data pipelines
* Experience using Apache Spark and Apache Airflow
* Experience with SQL and NoSQL databases (e.g., column stores, graph databases, etc.)
* Experience working on a cloud platform like GCP, AWS, or Azure
* Experience working collaboratively with Git
* Understanding of Docker/Kubernetes
* Interest in learning from and mentoring team members
* Experience supporting and working with cross-functional teams in a dynamic environment
* Passion for open source development and innovative technology
* Experience working with BI tools like BigQuery and Superset is a plus
* Understanding of knowledge graphs is a plus

$100,000 - $125,000 a year
The target base salary for this position is $100,000 - $125,000 USD plus bonus. Final offer amounts are determined by multiple factors, including location, local market variances, candidate experience and expertise, and internal peer equity, and may vary from the amounts listed above.

Benefits:
· Limitless growth and learning opportunities
· A collaborative and positive culture - your team will be as smart and driven as you
· A strong commitment to diversity, equity & inclusion
· Exceedingly generous vacation leave, parental leave, floating holidays, flexible schedule, & other remarkable benefits
· Outstanding competitive compensation & commission package
· Comprehensive family-friendly health benefits, including full healthcare coverage plans, commuter benefits, & 401K matching

Sayari is an equal opportunity employer and strongly encourages diverse candidates to apply. We believe diversity and inclusion mean our team members should reflect the diversity of the United States. No employee or applicant will face discrimination or harassment based on race, color, ethnicity, religion, age, gender, gender identity or expression, sexual orientation, disability status, veteran status, genetics, or political affiliation. We strongly encourage applicants of all backgrounds to apply.

#Salary and compensation
No salary data was published by the company, so we estimated a range based on similar Python, Cloud, and Engineer jobs:

$50,000 — $100,000/year
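One of the responsibilities above is writing Python modules that parse entity references and relationships from source data. A toy sketch of that parsing step follows; the input shape and every field name are hypothetical, since real source records vary by jurisdiction:

```python
def parse_record(record):
    """Parse one raw corporate record into entity and relationship rows.

    Returns (entities, relationships), where each relationship links an
    officer entity to the company entity it serves.
    """
    entities = [{"id": record["company_id"], "name": record["company_name"]}]
    relationships = []
    for officer in record.get("officers", []):
        entities.append({"id": officer["id"], "name": officer["name"]})
        relationships.append({
            "source": officer["id"],
            "target": record["company_id"],
            "type": officer.get("role", "officer"),  # default when role is absent
        })
    return entities, relationships

record = {
    "company_id": "c1",
    "company_name": "Acme Ltd",
    "officers": [{"id": "p1", "name": "J. Doe", "role": "director"}],
}
entities, relationships = parse_record(record)
print(relationships)
# [{'source': 'p1', 'target': 'c1', 'type': 'director'}]
```

Emitting flat entity and relationship rows like this is what lets a downstream graph build merge records from many sources into one network.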
We are currently seeking a Senior Data Engineer with 5-7 years' experience. The ideal candidate will be able to work independently within an Agile environment and have experience working with cloud infrastructure, leveraging tools such as Apache Airflow, Databricks, dbt, and Snowflake. Familiarity with real-time data processing and AI implementation is advantageous.

Responsibilities:
* Design, build, and maintain scalable and robust data pipelines to support analytics and machine learning models, ensuring high data quality and reliability for both batch and real-time use cases.
* Design, maintain, and optimize data models and data structures in tooling such as Snowflake and Databricks.
* Leverage Databricks for big data processing, ensuring efficient management of Spark jobs and seamless integration with other data services.
* Utilize PySpark and/or Ray to build and scale distributed computing tasks, enhancing the performance of machine learning model training and inference processes.
* Monitor, troubleshoot, and resolve issues within data pipelines and infrastructure, implementing best practices for data engineering and continuous improvement.
* Diagrammatically document data engineering workflows.
* Collaborate with other Data Engineers, Product Owners, Software Developers, and Machine Learning Engineers to implement new product features by understanding their needs and delivering in a timely manner.

Qualifications:
* Minimum of 3 years' experience deploying enterprise-level scalable data engineering solutions.
* Strong examples of independently developed data pipelines end-to-end, from problem formulation and raw data to implementation, optimization, and results.
* Proven track record of building and managing scalable cloud-based infrastructure on AWS (incl. S3, DynamoDB, EMR).
* Proven track record of implementing and managing the AI model lifecycle in a production environment.
* Experience using Apache Airflow (or equivalent), Snowflake, and Lucene-based search engines.
* Experience with Databricks (Delta format, Unity Catalog).
* Advanced SQL and Python knowledge with associated coding experience.
* Strong experience with DevOps practices for continuous integration and continuous delivery (CI/CD).
* Experience wrangling structured and unstructured file formats (Parquet, CSV, JSON).
* Understanding and implementation of best practices within ETL and ELT processes.
* Data quality best-practice implementation using Great Expectations.
* Real-time data processing experience using Apache Kafka (or equivalent) is advantageous.
* Ability to work independently with minimal supervision.
* Takes initiative and is action-focused.
* Mentors and shares knowledge with junior team members.
* Collaborative, with a strong ability to work in cross-functional teams.
* Excellent communication skills, with the ability to communicate with stakeholders across varying interest groups.
* Fluency in spoken and written English.

#LI-RT9

Edelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics, and data consultancy with a distinctly human mission.

We use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting, and connections more meaningful.

DXI brings together and integrates the necessary people-based PR, communications, social, research, and exogenous data, as well as the technology infrastructure to create, collect, store, and manage first-party data and identity resolution.

DXI comprises over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.

To learn more, visit: https://www.edelmandxi.com

#Salary and compensation
No salary data was published by the company, so we estimated a range based on similar Python, DevOps, Cloud, Senior, Junior, and Engineer jobs:

$60,000 — $110,000/year
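The data-quality work mentioned in the qualifications above (the role names Great Expectations) boils down to declaring expectations about a dataset and reporting which rows violate them. A dependency-free sketch of that idea follows; the check names and result shape are invented for illustration and are not the Great Expectations API:

```python
def check_not_null(rows, column):
    """Expect every row to have a non-null value in `column`."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"not_null({column})", "passed": not failures, "failures": failures}

def check_in_range(rows, column, low, high):
    """Expect non-null values in `column` to fall within [low, high]."""
    failures = [i for i, r in enumerate(rows)
                if r.get(column) is not None and not (low <= r[column] <= high)]
    return {"check": f"in_range({column})", "passed": not failures, "failures": failures}

rows = [{"price": 10}, {"price": None}, {"price": 999}]
results = [check_not_null(rows, "price"), check_in_range(rows, "price", 0, 100)]
for result in results:
    print(result)
# {'check': 'not_null(price)', 'passed': False, 'failures': [1]}
# {'check': 'in_range(price)', 'passed': False, 'failures': [2]}
```

In a pipeline, a failed check like these would typically quarantine the offending rows or fail the run before bad data reaches the warehouse.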