Data Scientist & Credit Scoring Specialist
About Rubyx
Hello, we're an international team with expertise in microfinance, data science, behavioral economics, and human-centered design. We started Rubyx in 2019 to revolutionize the way informal entrepreneurs and small businesses fund their activities.
We've built a cloud-native, API-driven digital lending platform that enables service providers to rapidly launch innovative loan products that meet the needs and expectations of borrowers. Rubyx lowers the cost of entry to AI capabilities and makes credit scoring algorithms easy to understand so that the power of data benefits all.
Pioneering new lending ecosystem models, we support different types of partnerships:
- We enable non-lenders such as digital platforms to easily embed lending features into their services to deliver more value to their customers.
- We work with banks and microfinance institutions to help them accelerate their digital transformation and reach new markets.
Currently, we operate in the following countries: Cambodia, Cameroon, Nigeria, Ivory Coast, Ghana, Tunisia, Jordan, Georgia, India, and Madagascar.
We're passionate about the many opportunities to rethink microfinance together, but we're not just painting pretty pictures: we draw our inspiration from many years of experience in microfinance and from spending day after day in the field observing how informal entrepreneurs and small businesses work.
Working with us
We each work with a high degree of autonomy and take responsibility for our tasks. Using tools like Slack and Trello, we stay focused on the most important activities. Joining us brings the benefit of being able to shape and build your practice in a start-up environment.
We work remotely and get things done. Our team is spread out from Canada to China, so sometimes you'll have to put in extra effort to schedule meetings that work for everyone. Mostly, though, this means flexible working hours for you.
We believe work should be fairly rewarded. We offer competitive compensation based on your experience and expertise.
Job description
One of Rubyx's core activities is the development of credit-scoring algorithms for risk management. We are looking for a data scientist specializing in credit scoring to help us expand our existing algorithms and push the quality of the services Rubyx offers even further.
Responsibilities
- Collaborate with the Data team (data engineering, analytics, science) to develop Rubyx's algorithmic solutions
- Work with product teams, users, and other stakeholders to identify opportunities for data-driven solutions.
- Translate clients' business needs into technical requirements and perform the data processing and modeling necessary to meet those needs
- Participate in designing the architecture of solutions based on GCP infrastructure
- Design and implement statistical and AI models for risk modeling and credit-related activities (see the illustrative sketch after this list)
- Define experimental settings to test hypotheses and refine modeling
- Contribute to R&D and internal asset development projects
- Co-design models based on client needs and communicate the resulting algorithms' results
- Document solutions, configuration, and program modifications according to internal standards.
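To make the kind of work listed above concrete, here is a minimal, purely illustrative application-scoring sketch in Python. It is not Rubyx's actual algorithm; the dataset path, feature names, and target column are hypothetical placeholders.

```python
# Minimal application-scoring sketch (illustrative only; not Rubyx's algorithm).
# The file path, feature columns, and target column are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("loan_applications.csv")  # hypothetical dataset
features = ["monthly_revenue", "loan_amount", "account_age_months", "prior_late_payments"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["defaulted"], test_size=0.2, random_state=42, stratify=df["defaulted"]
)

# Logistic regression keeps the score easy to explain to lending partners.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```

A simple pipeline like this is often a sensible baseline precisely because logistic-regression coefficients are easy to explain, which fits Rubyx's stated goal of making credit-scoring algorithms easy to understand.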
Requirements
Professional Skills
- 5+ years of experience in a data science (or similar) role
- Understanding of the banking industry and of risk-related business processes in emerging markets.
- Strong experience with credit scoring in various environments (bank, MFI, digital platform...) and contexts (application scoring, renewal scoring, collection scoring…).
- Expertise with database, data warehouse, and data pipeline technologies
- Experience with cloud infrastructure (GCP preferred) and SaaS environments.
- Experience with data quality best practices
- Experience in languages commonly used for data analysis such as SQL, Python, and R.
- Excellent technical competence and experience performing analysis across a breadth of techniques (e.g. exploratory analytics / causal inference/experimentation / ML techniques)
- Strong statistical knowledge and intuition, including experience designing, monitoring, and analyzing A/B tests
- Ability to start with a messy unstructured problem, identify specific research questions from it, and propose data and model-driven solutions that would create value for end-users and other stakeholders.
- Extensive experience working with large-scale datasets and using data engineering best practices (e.g. CI/CD, testing, git)
- Dedication to high-quality, maintainable code.
- Good verbal and written communication skills along with the ability to present technical data and approaches to both technical and non-technical audiences.
Personal Skills
- Fluent in English; French is a plus.
- Open-minded and able to work with people from different backgrounds and cultures.
- Cultivate the start-up spirit and look for innovation and excellence.
- Demonstrated ability to work independently and as part of a team in a distributed working environment.
- Ability to learn and adapt quickly in fast-changing environments.

At Fors Marsh, we combine the power of science and strategy to improve people's lives. Each day, we work with institutions and organizations that seek to disrupt markets, understand and influence behavior, drive action on a national scale, and create positive impact. Our approach extends far beyond our client portfolio—as a certified B Corporation and a 2020 Greenbook Top 50 Market Research Company, we make a difference in our community through corporate-sponsored employee volunteer programs and pro bono partnerships with values-aligned nonprofits. Most importantly, as a 2019-2022 Top Workplace, we are committed to putting people first and fostering a culture that reflects that commitment. We are proud to be an equal opportunity employer, and we celebrate diversity and inclusivity as the foundation of a healthy, successful, and innovative work environment. Join us, and together we can work to ensure a better tomorrow.
Fors Marsh is seeking an intelligent and motivated early-career researcher to work as part of our Military Analytics research team. Our Military Analytics team bridges the gap between traditional social science and data science, leveraging innovative analytic tools for research committed to improving the health and well-being of Service members and DoD personnel. This individual's primary responsibility will be conducting and reporting on analyses of survey, behavioral/archival, and geographic data. This job is best suited for someone who enjoys integrating and analyzing data, synthesizing market research into actionable recommendations, and who is comfortable working collaboratively with team members and clients.
Responsibilities include:
- Analyzing market research data, including data from large-scale tracking surveys as well as geographic, behavioral, and administrative data.
- Supporting all aspects of the research process, including survey creation, data cleaning, data analysis, and reporting.
- Conducting market segmentation analyses and evaluating drivers of attitudes and behaviors.
- Working to aggregate, organize, and explore large, complex quantitative data sets through a variety of techniques (a brief sketch follows this list).
- Analyzing data and interpreting results from descriptive and inferential analyses to identify patterns and solutions.
- Writing reports and delivering professional briefings that summarize research findings for non-technical audiences.
- Providing technical and non-technical support to project teams.
- Upholding the highest standards of quality control for yourself and others.
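As a small, hypothetical illustration of the data integration and aggregation work described above (shown in Python for brevity, although the role prefers R or Stata; file names and columns are invented), survey responses might be merged with administrative records and collapsed to group-level summaries:

```python
# Hypothetical example of integrating and collapsing survey and administrative data.
import pandas as pd

survey = pd.read_csv("survey_responses.csv")        # hypothetical: one row per respondent
admin = pd.read_csv("administrative_records.csv")   # hypothetical: one row per respondent ID

# Join the two sources on a shared respondent identifier.
merged = survey.merge(admin, on="respondent_id", how="inner")

# Collapse to region- and age-group-level summaries of a key attitude measure.
summary = (
    merged.groupby(["region", "age_group"], as_index=False)
          .agg(n=("respondent_id", "count"),
               mean_favorability=("favorability_score", "mean"))
)
print(summary.head())
```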
Qualifications:
- Bachelor's degree in social science, statistics, economics, computer science, humanities or a related field.
- Strong verbal and written communications skills.
- Ability to work effectively both independently and as a team member.
- Moderate level of proficiency in at least one statistical analysis software package (R and/or Stata preferred), including syntax-based analyses and data manipulation (e.g., collapsing, merging).
- Experience working with existing datasets and integrating from multiple sources
- Proficiency with Microsoft Office® products (e.g., Word, PowerPoint, and Excel®)
- Strong quantitative ability and a keen eye for detail and accuracy.
- Applicants may be subject to a low-level government security investigation and must meet eligibility criteria for access to sensitive information.
- US Citizenship Required
We Offer:
- Ability to make an impact on people's lives, both internal and external to the organization.
- Top-tier health, dental, vision, and long- and short-term disability coverage, all covered at 100% for employees.
- Remote work.
- A company culture that values balance and allows each employee to take the leave they need to manage both work and home responsibilities without worrying about depleting their available leave hours.
- We provide a floating holiday bank so you can celebrate the days you value.
- Generous matching retirement contributions and no vesting period starting the third month of employment.
- Dedicated training and development budgets to expand your expertise and grow your skill set.
- You can volunteer your way with paid time off.
- You can participate in Fors Marsh staff-led affinity groups.


Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us!
Dun & Bradstreet is looking for a data analyst to join our team, ideally located at our Jacksonville, FL world headquarters. The role provides engineering support to the data operations team by processing customer data files. The client deliverables you produce will help show the value of data waiting to be unlocked for insights and targeted marketing activities in the digital ecosystem. The role sits within the Audience Targeting division of Dun & Bradstreet, a fast-growing product line within the company that offers a high degree of career growth and development opportunities.
This role reports to the Director of Licensing Data Operations in the United States and will work closely with many departments including Technology, Product, Sales, and Customer Success. In this role, you will work to understand client business needs, then translate this to action by analyzing data and developing solutions for curating our vast array of data assets into solutions for some of the world’s largest brands. Your colleagues include Adtech, Direct Marketing and data industry veterans. Together, you will support the most sophisticated Data Science and Marketing teams at the Fortune 1k companies and some of the fastest growing companies in the world.
Responsibilities:
- Work in SQL-based systems including Redshift, BigQuery, Databricks, and command-line tools
- Interface with DevOps and Engineering to solve technical issues
- Process client deliverables by accessing Redshift tables, legacy exporter tools, and legacy export systems (see the sketch after this list)
- Complete work requests within engineering sprint schedules, documented in Jira, Confluence, and Excel spreadsheets
- Interact with data operations and internal client teams, and participate in client data review meetings
- Manage technical discovery efforts with the client to identify pragmatic solutions for meeting client data consumption requirements
- Develop technical specifications and execute delivery of data solutions
- Define features and phases in alignment with business goals
- Provide educational sessions and technical explanations for customers - including both technical and business-oriented audiences
- Distill large sets of complex information into meaningful, bite-sized summaries
- Work to become Subject Matter Expert (SME) for all things around a Personally Identifiable Information (PII) ID Graph
- Continued learning of B2B marketing best practices
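For illustration only, the sketch below shows one way a client deliverable could be pulled from a warehouse table and written out as a flat file using pandas and SQLAlchemy. The connection string (which assumes the sqlalchemy-redshift driver), table, and columns are hypothetical placeholders, not actual Dun & Bradstreet systems.

```python
# Hypothetical sketch: pull an audience segment from a warehouse and write a client file.
# The connection string, table, and columns are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Assumes the sqlalchemy-redshift dialect is installed.
engine = create_engine("redshift+psycopg2://user:password@example-host:5439/analytics")

query = """
    SELECT duns_number, company_name, industry_code
    FROM audience_segments
    WHERE campaign_id = 1234
"""
segment = pd.read_sql(query, engine)

# Deliverables are typically handed off as flat files (e.g., over SFTP).
segment.to_csv("campaign_1234_deliverable.csv", index=False)
print(f"Exported {len(segment)} rows")
```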
Requirements:
- 2+ years’ experience using SQL to analyze data and deliver both custom files and insights to customers
- Proficiency using command line-based tools
- Proficiency with setting up SFTP/FTP connections for file exchange
- Ability to troubleshoot and resolve issues with remote VPN servers using Cyberduck.io
- Strong software development and database experience
- Proficiency in SQL and relational databases
- Excellent analytical problem-solving skills
- Knowledge of DataBricks a plus
- Highly organized and capable of managing many projects at once
- Fabulous verbal and written communication skills
- Ability to adapt to a rapidly changing environment
- Hungry for career and personal growth, in a collaborative environment where feedback is welcomed


About Wurl, LLC.
Wurl is a global streaming network. Our B2B services provide streamers, content companies, and advertisers with a powerful, integrated network to distribute and monetize streaming television reaching hundreds of millions of connected televisions in over 50 countries. This year Wurl was acquired by AppLovin (Nasdaq: APP), an industry-leading mobile marketing ad tech company, bringing together the technology and innovation of the mobile and television industries. With the merger, Wurl employees enjoy the best of both worlds: the dynamic environment of a 160+ person start-up and the stability of a high-growth public tech company.
Wurl is a fully-remote company that has been recognized for the second year in a row as a Great Place to Work. Wurl invests in providing a culture that fosters passion, drives excellence, and encourages collaboration to drive innovation. We hire the world's best from streaming, advertising, and software technology to build a bunch of cool stuff and continue to help us disrupt the way the world watches television.
Data Science Engineer
Wurl is seeking a Data Engineer with knowledge of data science. For this role, you will design and implement technical solutions from the ground up for our Data Science platform. Wurl collects data from a range of OTT video streaming ecosystem components. We digest it, analyze it, and build a wide range of Data Science pipelines to support our products. The Data Science engineer reports to the manager of the Data Science team and interacts with Product, Solution Architects, data engineering, and various other functions across the organization.
What You'll Do
- Work alongside Data Scientists to build repeatable, high-scale Data Science pipelines
- Collaborate with the data engineering and ML teams to construct data pipelines to implement machine learning models into Wurl’s production framework
- Build highly secure, scalable, and reliable cloud-native pipelines that run 24x7
- Write complex SQL queries with the objective of minimizing the processing effort and time of the data science team’s data pipelines
- Collaborate with the data science team to build data models and prediction models
- Apply probability, statistics, predictive modeling, machine learning, and other quantitative methods to solve real-world TV delivery and advertising problems (see the sketch below)
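As a rough sketch of the kind of pipeline described above (the storage paths, column names, and prediction target are invented for illustration and are not Wurl's actual systems), a SparkML training pipeline might be assembled and persisted like this:

```python
# Hypothetical sketch of a SparkML training pipeline; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ad-fill-model").getOrCreate()

# Daily viewing/advertising facts, e.g., landed in a data lake by upstream jobs.
events = spark.read.parquet("s3://example-bucket/ott_events/")

assembler = VectorAssembler(
    inputCols=["watch_minutes", "ad_requests", "fill_rate_7d"],
    outputCol="features",
)
classifier = LogisticRegression(featuresCol="features", labelCol="converted")

# Fit the full pipeline and persist it so a downstream job can score new data.
pipeline_model = Pipeline(stages=[assembler, classifier]).fit(events)
pipeline_model.write().overwrite().save("s3://example-bucket/models/ad_fill_lr")
```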
Qualifications
- Master’s degree in Computer Science, Mathematics, Physics or related field plus 4 years of relevant experience
- Expert programming skills with Python and SQL
- Expertise in Spark/PySpark
- Experience with cloud/SaaS offerings (AWS preferred): containerization, serverless processes, microservice architectures, etc.
- Strong mathematical background and knowledge including but not limited to statistics, predictive modeling, linear algebra, machine learning
- Experience building ML models using frameworks such as TensorFlow, PyTorch, SparkML, scikit-learn
- Knowledge of Snowflake, Databricks, Git, and MLFlow
- Understanding of complex networks (mathematical structures)
- Data pre-processing, visualization, and data cleansing experience
- Experience dealing with big datasets and databases
- Effective communication skills
- Works well in a team and independently
- Soft skills: creative, innovative, open to receiving and giving constructive criticism, open to discussions, trustworthy and reliable
- Proven record of being an outstanding problem-solver
A Plus if You Have
- Advanced proficiency with data visualization tools (such as Tableau, Power BI, Domo)
- Experience with streaming video delivery and video formats
- Advertising infrastructure experience
What We Offer
- Competitive Salary
- Strong Medical, Dental and Vision Benefits, 90% paid by Wurl
- Remote First Policy
- Flexible Time Off
- 12 US Holidays
- 401(k) Matching
- Pre-Tax Savings Plans, HSA & FSA
- Ginger, Aaptiv and Headspace Subscriptions for Mental and Physical Wellness
- OneMedical Subscription for 24/7 Convenient Medical Care
- Paid Maternity and Parental Leave for All Family Additions
- Discounted PetPlan
- Easy at Home Access to Covid Testing with empowerDX
- $1k Work From Home Stipend to Set Up Your Home Office
Few companies allow you to thrive like you will at Wurl. You will have the opportunity to collaborate with the industry’s brightest minds and most innovative thinkers. You will enjoy ongoing mentorship and team collaboration, and you will understand what we mean by "human connection." You will be proud to say you're a part of the company revolutionizing TV.
At Wurl, we value work-life harmony and believe that family and mental health should always come first. Our team is fiercely passionate and contagiously enthusiastic about what we are building. While we are seeking those who know our industry, there is no perfect candidate and we want to encourage you to apply even if you do not meet all requirements.



At Health IQ, our vision is to ensure that the world's 1.5B seniors live their golden years better than previous generations. We believe in rewarding the health-conscious through savings, literacy, and educational tools.
We are a diverse and innovative group of individuals who thrive on big data and proven results. Our approach has enabled us to grow from roughly 200 to 900+ employees over the last year, and we expect continued growth and opportunities. If you believe that being health-conscious can improve lives and want to make a tangible difference through your work, then you’ll love what we’re doing at Health IQ – apply and join the team!
Medicare Advantage Data Analyst
Health IQ is seeking out a data analyst to empower informed decision making around our Medicare Advantage business strategy. Reporting to the Lead Medicare Actuary, this role will involve analytical work and model building to turn data into stories and actionable business recommendations. If you are a high performer with a blend of technical skills and critical thinking looking for a fast-paced, high-growth opportunity, then this is for you!
What you will be doing:
Understanding the drivers of why a Medicare Advantage customer may be unhappy with their plan is a trillion-dollar question. This role will leverage Health IQ’s unique data assets to build models and dashboards and to become an expert at answering this question at a deeper level than exists today.
The first step is to build a strong foundation around our customer persistency & satisfaction metrics by developing dashboards to study patterns. Use your curiosity to ask what drove the results and your analytical skills to answer those questions. Once you have developed a foundation, dive deep into the data to understand the underlying root causes that would lead a customer to leave a plan. Use these newly gained insights to develop models that predict future performance, and share ideas to positively impact our business outcomes throughout the customer lifecycle.
Essential Duties and Responsibilities
- Utilize Health IQ’s unique data to support feature engineering to identify new factors that influence a consumer’s decision to leave vs stay with a Medicare Advantage plan. This will include working with data from our Precision Medicare plan recommendation algorithms, health records, sales funnel metrics and other unique data sets.
- Develop, own and maintain our dashboards and models around persistency and customer satisfaction. Proactively develop new dashboards to ensure we have proper insights to drive strategic decisions.
- Enhance customer lifetime value models by adding unique factors that more accurately predict persistency. Develop reports that compare actual to expected results over time.
- Receive policy status data from insurance carriers and our finance team. Cleanse data and perform quality checks prior to feeding into models.
- Develop models to understand agent level value creation, taking into account sales funnel data and customer persistency to develop a holistic view of an agent’s contribution to company success.
- Provide data visualization support and ideas for the development of presentations to inform senior leadership, pitch ideas and influence strategy inside and outside the company. There will be regular opportunities for you to provide thought leadership to influence business strategy to company executives.
- Work closely with Health IQ’s President & Chief Business Officer, Chief Actuary, Chief Medical Director, and SVP of Medicare, with a unique opportunity to build business acumen from senior executives.
- You will have regular interaction with all areas that influence the customer experience and track data, including sales agents, marketing, engineering, operations, data science, legal, and finance. It will be essential for you to have a pulse on their feedback to add qualitative insights that deepen your quantitative analysis.
What we’re looking for:
- 2+ years of experience in a data analytics, actuarial, financial analyst or similar role. Experience in Medicare is a plus.
- A curious mindset and motivated to learn new topics on the fly.
- Strong background in working with analytics and model building in Excel.
- Experience with coding languages such as SQL and R
- Building out code libraries, following SDLC standards
- Knowledge of data gathering, cleaning, transforming, and visualization techniques.
- Comfortable juggling multiple tasks/deadlines simultaneously.
- Self-motivator with good judgment and a strong focus on details who wants to develop their strategic thinking.
- Able to work successfully both independently and collaboratively in a team environment.
- Takes ownership of deliverables and sees them through to completion with high quality; not afraid to ask questions for clarification and guidance when needed.
- Well organized with good habits to document workflow.
- Excited to work in a fast paced, high growth startup environment.
Health IQ cannot sponsor work visas, including OPT STEM, at this time.
To make the world a healthier place, we started in our backyard. We created a health-conscious environment that allows each of our employees to reach their personal health goals. Below are a few of the employee-led programs that make working at Health IQ truly unique.
- Career Growth
As a rapidly growing company, new opportunities for growth and development continue to become available. We believe in promoting from within, and look to reward high performing employees with new opportunities.
- Celebration
We believe the key is to celebrate those who have improved their health rather than cajole those who haven’t. We look for employees who take this positive and optimistic view in their work lives.
- Service to Seniors
Our whole mission and vision is to serve seniors to improve their health. We want employees who believe true happiness comes from being in service to others. We call these employees Health Heroes.
- Personal Responsibility
We believe that only you can make the decision to improve your own health and no one else can do this for you. We look for employees that tend to do the same.
- Excellent benefits
Competitive rates for our employees' medical, dental and vision insurance. We offer a 401K and pay 100% of your life insurance benefit option! We also offer various Flexible Spending Account (FSA) benefits to meet your and/or your family's needs. Only full-time employees are eligible for benefits.
- Join a Remote-first Culture
Our flexible, totally remote environment allows us to hire top talent throughout the U.S. The world has changed, and we’ve learned that being in an office is no longer the best way for our employees and our company to thrive.


About the Company

Makeship exists to empower influencers, creators, and brands of all sizes to develop and launch limited edition products that matter to their fans. Leveraging our design, manufacturing, and marketing expertise, we work with our partners to bring their product to life through our community-powered crowd-funding platform. Each product is given a window of 21 days to be funded by the community before we produce and ship to fans worldwide. We put our brand behind every product and guarantee quality and ethical sourcing. We're profitable, have grown the team from 2 to 70 people in 4 years, and we’re growing at an average annual growth rate of 200%+.
About the Role
As an early member of the Data Team, you’ll play a huge role in developing a scalable data strategy and building the team that executes it. Data is one of our most important assets, and your team will enable better decision-making for all teams across the company. Every day, you’ll collaborate cross-functionally, learn more about the content creation space, and watch your work make a measurable impact.
We want this to be the best work experience of your life, so we’ll pay you well, offer great benefits, and invest deeply in your personal growth.
Why this Role?
- Have a massive impact on the company. As the head of our data team, there will be plenty of opportunity for you to lead and implement technical projects. Your contributions will be felt right away, and will affect how we store and interpret data for years to come!
- Develop and architect significant changes. You will have the freedom to be creative and architect/develop data solutions for business problems. This will challenge you to think about your design and bring your own ideas to life.
- Join us at an epic time. We’re a profitable and growing startup with millions in revenue. We’ve bootstrapped the company from 2 to 70 employees in 4 years. Join us and experience exponential personal and career growth!
- Lead a team responsible for data engineering and analysis for key business problems
- Create quarterly plans and operationalize them in lean sprints for the data team
- Examine existing pools of data to identify gaps and opportunities, and propose changes
- Identify risks to data privacy and security and implement processes to prevent breach or loss
- Develop company wide data strategy and architecture to keep up with Makeship’s growth
- Foster a culture of data-curious decision-making across our growing organization
- Collaborate with Product, Engineering, and other departments from prototype to production
- Implement frameworks and processes to ensure data pipelines and reporting are accurate (see the sketch after this list)
- Break down complex analyses into digestible insights for founders and executives
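As one lightweight, hypothetical example of the accuracy checks mentioned above (the table, columns, and rules are invented), assertion-style validations can run before a reporting table is published:

```python
# Hypothetical data-quality checks run before a reporting table feeds dashboards.
import pandas as pd

orders = pd.read_parquet("warehouse_exports/orders.parquet")  # placeholder path

checks = {
    "no duplicate order ids": orders["order_id"].is_unique,
    "no negative totals": (orders["order_total"] >= 0).all(),
    "campaign window present": orders["campaign_end_date"].notna().all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Fail loudly so bad data never silently reaches executives' reports.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All reporting checks passed")
```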
- At least 3 years of relevant work experience in data engineering and/or analysis
- At least 2 years of managing teams that report directly to you
- Strong business and technical intuitions, with an ability to quickly learn new technologies
- An ability to break down complex questions into a clear methodology
- Ample experience with KPIs and reporting to executive stakeholders
- Interest in the content creation space: art, animation, gaming, and entertainment!
- Proven experience in building databases, warehouses, pipelines, visualizations, and/or ETL
- A strong understanding of relational databases, data storage, and data manipulation
- A strong ability to wrangle data and find answers to complex real-world questions
- Experience with cloud native infrastructure (AWS, Docker, Kubernetes, etc.)
- Experience with BI tools like Google Data Studio, Power BI, etc
- Excellent communication with both technical and non-technical stakeholders
- An ability to gather information and requirements yourself, in a fast-paced environment
- Worked in a fast-paced startup or similar environment
- API expertise for Google Analytics, Facebook, Twitter, etc.
- Worked with data from Shopify, Hubspot, or Airtable APIs.
- Experience with our technology stack (PostgreSQL, BigQuery, Kafka, AWS, GCS)
- Prior experience building and scaling ML solutions
- Work remotely anywhere in Canada and/or access any of our hubs
- Health and dental benefits 100% employer-paid
- 3 weeks of paid vacation
- 1 week of paid time off during the holidays
- 2 mental health and wellness days
- Paid time off on your birthday
- Monthly phone allowance
- $400 home office setup allowance
- Pregnancy and parental leave top-up program
- Learning and development opportunities
- Employee referral program
Makeship is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Get to know us

eyeo is an open source software company whose market-leading ad-filtering technology powers products like Adblock Plus and AdBlock as well as technical integrations for leading browsers on the market. Our privacy-first products like Crumbs promote transparency and an improved browsing experience. eyeo is dedicated to giving users control of their online experience while fostering a sustainable online value exchange between content creators, publishers, advertisers and users.
In combining our User Growth partnerships and our subsidiary products, our technology reaches 250 million monthly active users.
At eyeo, we’re passionate about user agency, personal privacy and sustainability as well as keeping the web an open, fair resource for everyone.
How we work
eyeo has colleagues based all over the world. We love our diversity and our company culture. We practice an agile, remote workstyle with work distributed in cross-functional teams that span nearly every timezone. Many of our tech teams prefer to work asynchronously. If working remotely isn’t for you, we also have two offices (Cologne and Berlin) that you can choose to work from. Twice a year we come together with our teams for Team Days and once a year we offer an offsite company retreat, Summer Week. By offering remote work and in-person sessions, we have built a hybrid culture that offers a unique dynamic of flexibility and belonging.
What you'll do
As a Business Intelligence and Analytics Manager, you’ll be one of ten people on the Finance Team and will drive BI initiatives across the whole organization. This role is a mix of management and data engineering responsibilities, with a bigger focus on the former. You will be a partner who understands the needs of our internal stakeholders, proposes solutions, and delivers results.
After your morning routine, you'll be expected to...
- Build up Power BI as the main go-to data hub for all decision makers across the company
- Develop and execute the BI strategy and roadmap
- Prioritize, plan, and execute data analysis projects that are touching on all areas of the business
- Build and automate meaningful dashboards that will help stakeholders make effective decisions
- Provide actionable insights regarding financial, product, project, and user data to help achieve company targets
- Manage a data engineer who builds and runs the technical BI infrastructure (data warehouse, ETL processes, etc.)
- At least 5 years of relevant experience in Business Intelligence
- Ability to help teams understand their information needs
- Ability to clearly present insights to both technical and non-technical stakeholders
- A passion for understanding data and identifying trends
- Joy in working in an international and diverse work environment
- Educational background in Business, Computer/Data Science or equivalent experience
- Excellent communication and collaboration skills
- Have experience with Power BI, Jira, Personio, LucaNet
- Know about IBCS reporting standards
- Have some technical understanding (Azure SQL Warehouse, MS SQL Server Management Studio, Power BI Dataflows, Power Query M, SQL/T-SQL, DAX)
- Work remotely or from one of our offices —we trust you to find what works best for you
- Budget for the following: home office and/or relocation
- Flexible working hours
- 28 days paid vacation + Volunteer day
- Your choice of hardware and setup
- Personal and professional development budget
- Monthly child care allowance for children under 6
- Offsite team days and the annual summer company retreat
- Company-sponsored hackathons
- Monetization: https://adblockplus.org/blog/acceptable-ads-explained-monetization
- eyeo in the news: https://eyeo.com/press/
- Blog updates: https://eyeo.com/blog/
- Life at eyeo: https://eyeo.com/careers/working-at-eyeo
- The team at eyeo: https://eyeo.com/team
- Open Source repository: https://github.com/adblockplus, https://gitlab.com/eyeo
eyeo is an equal opportunity employer and encourages applications from suitably qualified and eligible candidates regardless of sex, race, disability, age, sexual orientation, or religion or belief. When you apply, you’ll be automatically forwarded to our recruitment platform operated by an external service provider called Greenhouse (seated in the US). Greenhouse collects some information on its website, such as anonymous usage statistics, by using cookies, server logs, and other similar technology. For more information, please refer to Greenhouse’s Privacy Policy. All documents and information provided by you are stored with Greenhouse. In order to ensure an adequate level of data protection, eyeo and Greenhouse have entered into the EU Standard Contractual Clauses (“processors”) - Commission Decision C(2010)593. You can request a copy of this by contacting us at privacy[at]eyeo.com. If you don’t want your data forwarded to Greenhouse, please do not apply. For detailed and further information, please refer to our Privacy Policy at https://eyeo.com/en/privacy.

Elevar is used daily by thousands of Shopify brands all across the world to help ensure they have an accurate data foundation they can trust and scale their business with. We simplify the complex world of tagging and event tracking.
A Web Analytics Specialist at Elevar is expected to learn on the fly with new solutions that Elevar delivers, adapt to changes in our industry (like iOS14 and server side tagging!), and help maintain our expertise in onsite tagging and data collection. Your ideas and expertise are critical in our growth!
The person who fills this role should enjoy working with Google Analytics, Google Tag Manager and of course - data. You will be responsible for tagging implementation projects, onboarding new customers to Elevar, troubleshooting complex tracking issues, and ensuring customers are maximizing their value out of Elevar.
Why Work Here?
- We’re a young, ambitious company who prides ourselves on learning and solving complex challenges in the world of data collection
- You want to learn new skills and have a voice in our product roadmap
- Opportunity to have a big impact on our growth (we've more than 2x'd every year since 2019) and advance your career (opportunities to grow vertically in your expertise or across other roles in the company)
- Transparent and open organization
- Since you work so hard, we offer no-questions-asked unlimited PTO for mental breaks and relaxation
- We promote a flexible work culture for everyone, including the option to pick the times that work best for you
Our Company Values
- Accountability: Being accountable to our customers, teammates, and ourselves is part of the core of Elevar.
- Detail Oriented: Careless mistakes and rushed oversight can be expensive. Measure twice, cut once.
- Positive Energy: We believe in making our work fun and being a lighthouse of positive energy for customers.
- Healthy Life: Whether health means exercising, reading, spending time with family, or traveling - make time for it. Unplug and recharge. Stay balanced.
- Keep it Real: We believe keeping it real is the best way to communicate. Express your feelings, respectfully.
- GSD: Our work is our pride. Getting stuff done is progress in the face of perfection.
What To Expect:
- 70% acting as a web analytics specialist on a dedicated team overseeing the onboarding, implementation, and ongoing support of Elevar for an assigned set of customers.
- 20% expanding your analytics skills, tracking knowledge, and Elevar product expertise, and then implementing these solutions across customers.
- 10% helping with tagging questions from our self-serve customer base, and mentoring the support team through internal training and documentation
Requirements
- Own the relationship of an assigned set of customers as part of your core team of specialists, and be the primary point of contact and voice within Elevar for analytics or tagging requests.
- Onboard and activate customers onto Elevar's tracking platform
- Build and maintain strong, long-lasting relationships with your customers by understanding their tracking goals and pain points in order to deliver solutions that help improve their business.
- Execute Google Analytics & GTM audits and implement recommended changes across client accounts.
- Provide backup support and assistance to other analysts.
- Be ready and excited to learn and expand your knowledge in eCommerce data tracking. Elevar is on the cutting edge of tracking and a leading voice in the industry, and we are looking for someone who is eager to learn from our experienced team.
What You'll Need
- Self-motivated and proactive mindset.
- 3+ years experience in a customer-facing success role (Support, Success, Account Management, or Sales).
- Exceptional multi-tasking and context-switching capabilities
- Strong collaboration and communication skills
- Measurement, reporting, and other data analytics skills
- Experience working in a startup environment
- The desire to commit, over-deliver, and follow up with customers to make sure they’re successful
Benefits
- Competitive salary, 100% health/dental/vision insurance for US employees, and 401K option
- Annual company retreat, plus home office perks such as a $2,500 stipend for new employees to put toward a home office setup that's yours to keep
- We invest in your development with a $1,000 per year professional development stipend for conferences, courses, or anything that may help you grow

About Yay Lunch:
We are Yay Lunch, a new business concept built to provide healthy food for kids away from home, beginning with school lunches. Our mission is simple: deliver delicious, healthy lunches made by local vendors to kids at school in an easy, reliable, cost-effective way for parents. How do we do it? We curate menus with local vendors in each market, families order online, and we deliver fresh lunches daily to kids at school. Easy!
Since launching a pilot in the 2017 / 18 school year, we have expanded rapidly. We have aggressive growth plans backed by experienced venture capital partners, and we are excited about our prospects!
We're a startup which means we are agile, outgoing, quick on our feet, and eternally positive. Our business is exploding into new cities up and down the east coast, which means we're also pretty excited!
To enable this work, we are hiring a passionate problem-solver who will help take our analytics platform to the next level. Do you have great ideas and the enthusiasm to build them? Join our team! Together, we will brainstorm, collaborate, design and deliver solutions that get great kids great food, right when they need it most.
About the team:
We take our work seriously, while recognizing that work is just one part of being a real human. We rise and fall as a team.
If you have strong skills, seriously enjoy a variety of work both tactical and strategic and don't mind getting your hands dirty, can both teach and learn from others, and can own both your successes and your mistakes, this may be a good spot for you. We have a “no stupid questions” policy and encourage egoless collaboration.
In this role, we expect travel to be minimal, but you'll periodically get the chance to ride along with a driver, visit a food vendor, and help deliver to a school near you to stay in close contact with how our business works.
We firmly believe that talent comes in a great variety of forms from a wide variety of backgrounds, experiences, perspectives, and lifestyles.
About the Role:
The Senior Manager of Data Science and Business Analytics will play an integral part in positioning Yay Lunch for continued success in the future. In this role, you will be responsible for building, owning and evolving the overall analytical reporting team and environments for the Yay Lunch business. You'll help drive actionable insights for the executive team that provide not only transparency into existing business performance, but that also help to model our possible futures, and you'll help lead the creation of a robust and scalable analytics platform.
This is both a hands-on role as well as a leadership one: expect to be involved at all levels of projects.
The ideal candidate will be able to thrive in ambiguity, own and execute multiple concurrent deliverables (both hands-on as an inidual contributor and by mentoring and guiding the team) while working effectively with executive leaders and stakeholders across the company. You'll ensure all stakeholder reporting needs are met in a repeatable and scalable way, while planning, delivering, and evolving our data management and reporting environment to meet the needs of our rapidly growing business and driving actionable insights that make a real impact on our business.
While this is a remote role, we will be hiring a candidate in one of our existing markets: the Atlanta, Baltimore, Philadelphia, or DC metro areas, or Virginia, Maryland, or New Jersey.
Key objectives:
- Partner with executive leadership and business stakeholders to develop analytics and standardized reports to support business performance reviews and to answer key business questions with a focus on scalable, actionable insights
- Own the overall design and support of the analytical reporting ecosystem, including collaborating with engineering and functional business leads on data acquisition, data management, technical architecture, data warehousing and report delivery solutions.
- Evaluate existing tools and maximize their use and/or identify new solutions appropriate for a scrappy, budget-conscious startup
- Help create predictive models that influence future business success
- Personally drive top-level business analytics that deliver transparency into how we are performing against our plans
- Develop self-service reporting capabilities for the business stakeholders and mentor local teams in their use
- Help guide the executive team in establishing, implementing and maintaining data governance practices & policies, and develop reporting to drive alignment
- Help shape future technology work to both deliver data needed for future analysis, and to leverage the right data and algorithms for future automation and A/B testing
- Manage team priorities across multiple stakeholders, workflows for incoming requests, and staffing
Minimum Qualifications:
- 5-7 years of data science, business intelligence, business data governance, and/or analytics experience, with some experience in fast-moving start-up environments and some in larger companies. Consulting experience is welcome.
- Experience creating and leading a data science/business intelligence function.
- Experience and comfort generating actionable insights from complex datasets, and the creation and management of large scale data modeling. Experience creating predictive models of success based on past business performance
- Comfortable presenting both high level and detailed concepts to a broad variety of audiences, including executive leadership
- Strong experience with SQL, R, Python, and other data languages, cloud warehouses, data transformation
- Hands-on experience with reporting and data tools (Snowflake, Mode, Looker, Tableau, Power BI, GitHub, dbt) and Google Analytics.
- Strong project and program management skills.
- Experience coaching, mentoring, and directing a team of direct reports
- A strong sense of personal humility.
- Flexibility, curiosity, adaptability. Comfort with ambiguity and with creating tools and processes to build a path to success for the department. Readiness for change.
- Ability to be effective in an organization whose staff all work remotely
- Bachelor's degree in a relevant field and/or equivalent experience
- Preference will be given to candidates in or near an existing Yay Lunch Market location.
We provide:
- Salary range $75,000 - $100,000 depending on experience
- Bonus + company options
- Medical, vision, dental and life insurance
- 401K, up to 3.5% employer match
- Flexible paid time off (unlimited time off program)
- Technology allowance, data stipend, home office allowance
- Remote, flexible, and family-friendly environment
- Significant support structures, on-the-ground training, & coaching from central team members
Yay Lunch requires all team members, including new hires, to report their COVID-19 vaccination status.


Title: Data Entry Operator 2
Location: US National – Remote
Change Healthcare is a leading healthcare technology company with a mission to inspire a better healthcare system. We deliver innovative solutions to patients, hospitals, and insurance companies to improve clinical decision making, simplify financial processes, and enable better patient experiences to improve lives and support healthier communities.
Data Entry Operator II
What will I be responsible for in this job?
- Enters data from a source document into a computer system.
- Verifies entries for completeness.
- May convert data from one electronic system to another.
- Performs audits of own work or that of others to ensure conformance with established procedures.
- May work with stand alone data systems or enterprise-wide tools supporting activities such as marketing lead tracking, fulfillment, manufacturing, and equipment maintenance or materials transactions.
What are the requirements?
- 1+ years of experience in data entry
- High School or equivalent
What critical skills are needed for you to consider someone for this position?
- Ability to multitask, work independently and efficiently, also be able to clearly communicate with the team to ensure collaboration on projects.
- Be able to follow directions and troubleshoot independently any items that may fall outside of the normal scope of the directions.
- Effectively be able to communicate with management team on progress and any issues needed to be resolved either independently, as a team, or with assistance from management.
What other skills/experience would be helpful to have?
- Experience with Excel, IDK/Group Management, and navigating and utilizing various websites independently.
- Work from home or remote experience is a plus.
How much should I expect to travel?
- None
COVID Vaccination Requirements
We remain committed to doing our part to ensure the health, safety and well-being of our team members and our communities. As such, some iniduals may be required to disclose COVID-19 vaccination status prior to or during employment. Certain roles may require COVID-19 vaccination and/or testing as a condition of employment. Change Healthcare adheres to COVID-19 vaccination regulations as well as all client COVID-19 vaccination requirements and will obtain the necessary information from candidates prior to employment to ensure compliance.
Location: Remote
Work anywhere in the US, must live in the US
Join a growing IT Operations team that supports a company of ~150 people with industry-leading web applications in areas of subscription-based continuing education, simulation and training, eCommerce, content management, and more.
ABOUT THE COMPANY
A leader in online education, Continued is composed of passionate individuals dedicated to providing an exceptional learning experience that enhances the lives and careers of the professionals we serve. We are fast-paced, nimble, and laugh a lot. We all work from home offices in various locations across the country. We are dynamic, engaged, highly motivated, and highly skilled. We are passionate about our work and our professions and we're looking for another team member to join our cause. We are certified as a great workplace, ranked #100 in Fortune's 2022 Best Small & Medium Workplaces for Millennials, ranked #86 in Fortune's 2021 top 100 Medium-Sized Workplaces in the US and ranked #13 in Fortune's 2020 top 25 Small & Medium Workplaces for Women by the independent analysts at Great Place to Work®. Learn more about us at continued.com and parent company lacallegroup.com.
We celebrate diverse voices within our organization, and in the professions we serve, and we welcome and encourage applications from candidates of all backgrounds. We strive to be an inclusive workplace where our team members feel empowered and supported in their careers. We do not discriminate based on race, ethnicity, national origin, sex, gender identity, sexual orientation, religious creed, disability status, veteran status or age. We are also committed to paying our team members at fair rates based on their skills and experience.
ABOUT THE TEAM
You will be joining the IT Operations team; though we are virtual and scattered across the country, we communicate frequently, have a tight-knit team, and support each other across roles and projects. We are driven by endless curiosity, a love of learning, and a desire to solve problems.
ABOUT THE POSITION
We are seeking an experienced Database Developer who is motivated and creative to support the database needs of our growing company. You will administer our database environments as well as help improve our data operations. This position reports to the IT Operations Manager and works closely with our Senior Data Engineer, Data Reporting Analyst, Systems Engineers, and Development teams.
- Full-time W2 position, work core U.S. business hours in your time zone
- Ability to fulfill the job requirements with or without reasonable accommodations
- Work from your home office
- Company-provided MacBook Pro, Internet stipend, etc
- Must be located in the United States and be legally eligible to work for us
- Travel is required for an annual company meeting, and may be required once or twice per year for team or project meetings
- We offer comprehensive medical, dental, and vision group health insurance, flexible spending accounts, telemedicine, accident insurance, and more
- We offer 21 days of flexible paid time off per year (increases with tenure) as well as your birthday and holidays (you may substitute any of the standard holidays for another public holiday such as bank/school or other regional or religious holidays)
- We offer a 401k match, professional development, charitable gift matching, robust wellness programs, and more
RESPONSIBILITIES
- Administer our MySQL Database environments, including backups, routine maintenance, recovery planning/testing, upgrades, and optimization.
- Create and optimize queries, processes, and procedures to aid in improving system functionality and performance (see the sketch after this list).
- Work with Development teams to ensure complex queries and reports are optimized for scale.
- Work with Security Analysts & Systems Engineers to ensure the security and reliability of our databases.
- Help drive and support our long-term data strategies by working with Data Engineers and Data Analysts to support data warehousing, analytics, and reporting.
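As a small, hypothetical sketch of the query-optimization work described above (the connection details, schema, and query are placeholders, not Continued's actual systems), a slow report query's execution plan can be inspected from Python with MySQL's EXPLAIN:

```python
# Hypothetical sketch: inspect a slow report query's execution plan from Python.
# Connection details, schema, and query are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="db.example.internal", user="report_ro", password="...", database="courses"
)
cur = conn.cursor(dictionary=True)

slow_query = """
    SELECT c.title, COUNT(*) AS completions
    FROM course_completions cc
    JOIN courses c ON c.id = cc.course_id
    WHERE cc.completed_at >= NOW() - INTERVAL 30 DAY
    GROUP BY c.title
"""

# EXPLAIN shows whether the join and the date filter are using indexes.
cur.execute("EXPLAIN " + slow_query)
for row in cur.fetchall():
    print(row["table"], row["type"], row["key"], row["rows"])

cur.close()
conn.close()
```

Reading the plan's type, key, and rows columns is usually the quickest way to spot a missing index before a heavy report query reaches production.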
REQUIREMENTS
- Strong experience optimizing complex queries and reports in a scaling environment
- Highly attentive to detail, able to dot all the i's and cross all the t's, and able to help review and fix issues with others' SQL queries
- There is direct project oversight, but you are expected to take ownership of your work and your projects, establish deadlines, and communicate progress proactively
- Strong self-management skills, capable of thriving in a virtual work environment
- Minimum 3 years of experience doing SQL-related work with some of the following (you are not expected to have extensive experience with all of these tasks):
- Experience administering SQL databases
- Experience optimizing complex queries and reports in a production environment
- Database server assessments, health checks, performance analysis, and security audits
- Backup & maintenance configuration
- Stored procedure development
- Knowledge of security best practices and an understanding of how to implement them
- Structured, disciplined approach to development, testing, and deployment
Preferred, but not required
- MySQL administration experience
- Experience designing and implementing databases according to industry best practices and end-user requirements
- RBAC implementation and management experience
- Experience working with: AWS (RDS, Aurora), Snowflake, Domo
TO APPLY
To apply, please submit a cover letter with your application. In your cover letter, we'd love to hear about: 1) why you would make a great addition to our team; and 2) a project achievement you are particularly proud of and why AND/OR something you've learned recently, etc.



Billing Representative
Location: Work from Home/Remote
PeopleShare is actively interviewing for a Remote Billing Representative for a great Non-Profit!
Job Details for the Billing Representative:
- Schedule: Monday-Friday 8AM-4PM
- Pay Rate: $20-22 per hour based on experience
- Contract-to-Hire role!
- 100% Remote role with equipment provided.
Job Responsibilities & Description for the Billing Representative:
- This role ensures a smooth and compliant billing process through timely data entry, daily monitoring of units of service in the database, spot checks, data integrity audits, and reconciliation of all billing-related issues with contracted external agencies to ensure complete and timely payments to the company.
- Monitor and reconcile provider entries and run remittance reports to ensure proper billing and payment for assigned provider accounts.
- Provide data to and collaborate with the Billing Manager on quarterly billing platform reporting, including, but not limited to, contractor costs and Service Verification Notes (SVN) from the service delivery database for Philadelphia and Chester.
Job Requirements for the Billing Representative:
- Must be Vaccinated.
- High School Diploma is required
- Two (2) years of experience in third party billing and collection procedures
- One (1) year of experience with Medical Assistance (MA) billing
- Non-profit human service experience preferred
- Intermediate experience with Excel is required.

Fastly helps people stay better connected with the things they love. Fastly’s edge cloud platform enables customers to create great digital experiences quickly, securely, and reliably by processing, serving, and securing our customers’ applications as close to their end-users as possible — at the edge of the Internet. The platform is designed to take advantage of the modern internet, to be programmable, and to support agile software development. Fastly’s customers include many of the world’s most prominent companies, including Vimeo, Pinterest, The New York Times, and GitHub.
We're building a more trustworthy Internet. Come join us.
Data Governance Manager
We are looking for a Data Governance Manager to help evolve and mature Fastly’s data governance program. In this role, you will partner with stakeholders across the organization to identify our key data assets and promote good data stewardship, helping the company understand and maintain appropriate data protection, privacy, and handling measures. You will empower teams to incorporate data-focused security considerations, regulatory requirements, and customer obligations into their systems and workflows in straightforward, repeatable, auditable ways.
This position reports to the Senior Director of Compliance within the Security organization.
What You'll Do
- Build and evolve our enterprise-wide data governance program, spanning data lifecycle management in our customer-facing platform, as well as our internal systems and data stores
- Partner with stakeholders across the business to catalog key data assets and understand how this data traverses our systems, environments, and processes
- Centralize and harmonize data protection, privacy, retention, and handling requirements from Security, Compliance, Law, Finance, Engineering, and other departments into a cohesive data governance framework
- Drive implementation and management of operational aspects of the data governance program, including data protection impact assessments, data mappings, and data considerations in our product documentation
- Develop and publish self-service, playbook, and guidance material to help others more easily meet data governance considerations tied to their job responsibilities
- Facilitate completion of audit procedures and responses to customer inquiries pertaining to data governance and privacy topics
What We're Looking For
- Several years of relevant work experience in establishing, maintaining, and/or assessing data governance and privacy programs
- Tenured experience supporting data governance programs or needs at a modern cloud service provider or similar organization
- Demonstrated experience building or supporting data governance programs from an early stage of maturity
- Ongoing awareness of industry and regulatory trends in data governance domains, including best practices and tools
- Proven ability to work with technical contacts and interpret technical concepts while establishing or executing data governance processes
- Demonstrated proficiency in prioritizing and completing multiple concurrent projects and tasks
- Strong communication and interpersonal skills, both written and verbal
We value a variety of voices, so this is not a laundry list. It would be an added bonus if you have experience in ANY of these:
- Data privacy-focused certifications, CIPT in particular, strongly preferred
- Experience with Atlassian products (Confluence, Jira) for managing project work and maintaining process documentation
- Experience using GRC tools to maintain controls documentation mapped to relevant regulations, industry standards, or risk areas
- IT audit or information security certifications (e.g., CISA, CISM, CISSP)
The estimated salary range for this position is $113,000 to $170,000. Starting salary may vary based on permissible, non-discriminatory factors such as experience, skills, qualifications, and location. This role may be eligible to participate in Fastly’s equity and discretionary bonus programs.
Benefits
We care about you. Fastly works hard to create a positive environment for our employees, and we think your life outside of work is important too. We support our teams with great benefits that start on the first day of your employment with Fastly. Curious about our offerings?
- We offer a comprehensive benefits package including medical, dental, and vision insurance. Family planning, mental health support along with Employee Assistance Program, Insurance (Life, Disability, and Accident), company paid holidays, paid time off and paid sick leave are there to help support our employees. We also offer 401(k) (including company match) and an Employee Stock Purchase Program.
Fastly reserves the right to amend or modify these benefits for any reason in accordance with applicable law.
< class='"content-conclusion"'>Why Fastly?
- We have a huge impact. Fastly is a small company with a big reach. Not only do our customers have a tremendous user base, but we also support a growing number of open source projects and initiatives. Outside of code, employees are encouraged to share causes close to their heart with others so we can help lend a supportive hand.
- We love distributed teams. Fastly’s home-base is in San Francisco, but we have multiple offices and employees sprinkled around the globe. In fact, 50% of our employees work outside of SF! An international remote culture is in our DNA.
- We care about you. Fastly works hard to create a positive environment for our employees, and we think your life outside of work is important too. We support our teams with great benefits like up to 20 weeks of paid parental leave, options for free medical/dental/vision plans, and an open vacation program that enables our folks to take the time they need to recharge (some benefits may vary by location).
- We value diversity. Growing and maintaining our inclusive and diverse team matters to us. We are committed to being a company where our employees feel comfortable bringing their authentic selves to work and have the ability to be successful -- every day.
- We are passionate. Fastly is chock full of passionate people and we’re not ‘one size fits all’. Fastly employs authors, pilots, skiers, parents (of humans and animals), makeup geeks, coffee connoisseurs, and more. We love employees for who they are and what they are passionate about.
We’re always looking for humble, sharp, and creative folks to join the Fastly team. If you think you might be a fit, please apply!
Fastly is committed to ensuring equal employment opportunity and to providing employees with a safe and welcoming work environment free of discrimination and harassment.
Employment decisions at Fastly are based on business needs, job requirements and individual qualifications, without regard to race, color, religion, gender, gender identity, gender expression, sexual orientation, national origin, family or parental status, disability*, age, veteran status, or any other status protected by the laws or regulations in the locations where we operate. Fastly encourages applicants from all backgrounds.
*Applicants with disabilities may be entitled to reasonable accommodation under the terms of the Americans with Disabilities Act and certain state or local laws. A reasonable accommodation is a change in the way things are normally done which will ensure an equal employment opportunity without imposing undue hardship on Fastly. Please inform us if you need assistance completing any forms or to otherwise participate in the application process.
Fastly collects and processes personal data submitted by job applicants in accordance with our Privacy Policy. Please see our privacy notice for job applicants.


Location: UK
Zego is a commercial motor insurance provider that powers opportunities for businesses, from fleets of just two vehicles to global enterprises, and for inidual drivers and riders. Its mission is to provide businesses and people with insurance they control, saving them both time and money.
The problem that exists is that in an ever-changing world, traditional insurance holds businesses back. Zego, on the other hand, helps businesses to unlock their full potential by putting them in control. Using smart technology and sophisticated data sources, Zego gives businesses the power to monitor and improve their driving performance over time, enabling them to save money by retaining a great price for their cover. It also helps businesses save time, making the administrative side of things easy and the claims process effortless.
Since its inception in 2016, Zego has grown to support businesses in the UK and across Europe and has forged partnerships with businesses such as BP, Amazon and Uber. Zego has also raised $280 million in funding and was the first UK insurtech to be valued at over $1 billion.
Overview of the B2B Data team
The mission of the B2B Data team at Zego is to leverage novel data sources and modern computational tools to deliver value through data-driven insights for Business to Business insurance products. It is a growing team, responsible for all aspects related to data, including data sourcing, cleaning, extraction, problem formulation, implementing and testing models, productionising exciting new tools and algorithms, and designing scalable tools, practices, and systems to support data science analysis and deployment. The team partners closely with Product, Engineering, Risk Management and wider Commercial and Business functions to find the best ways to incorporate statistical models and evidence-based decision making into Zego’s products and processes.
Purpose of the Data Scientist:
Our Data Science team is looking for an ambitious and experienced Data Scientist to join their growing team to help work on the development of insurance risk models.
The Data Scientist will be working closely with Data Scientists, Engineers and Risk Managers to improve Zego’s risk identification and selection capabilities. The Data Scientist will play an integral role in researching, testing and deploying new models and powering Zego’s risk strategies through data-driven insights and evidence-based decision-making. Therefore, experience working as a Data Scientist and using Python and SQL, along with an understanding of good development practices (git, code review, documentation), is essential.
The Data Scientist will also be working on projects involving data acquisition, risk management, behavioural modelling, and actuarial risk assessment, offering a wide range of exposure and plenty of chances to develop as a data professional. In doing so, you will be given the support and backing needed to develop your skills, including on-the-job training, an L&D budget, and a personal career development plan.
This is a great opportunity for an experienced Data Scientist who would like more variety in their role, wants to leverage novel sources of behavioural data and state-of-the-art machine learning techniques and, importantly, would like to fashion a strong career in data with a company that is disruptive and likes to do things a little differently.
What you’ll be working on as Data Scientist:
- You will dive into the heart of Zego’s R&D efforts and help to build market-leading behavioural and risk models
- You will work closely with the Senior members of the Data team to design, develop, deploy and maintain industry leading statistical and machine learning models
- You will work with various types of diverse datasets, including claims, driver behaviour, shift and trip information, traffic, weather, telemetry, geospatial data, and web and smartphone sensors
- You will contribute to improving Zego’s understanding of how driver behaviour relates to actuarial risk and how Zego can leverage these insights to disrupt the way the industry currently operates
- You will work with the team’s Data Engineers and Analysts to action the models and insights produced by Data Science
What you will need to be successful as a Data Scientist:
- You have a BSc/BA in Mathematics, Statistics, Engineering, Computer Science or similarly quantitative disciplines
- Experience working as a Data Scientist is a must
- You are proficient working with and querying structured data using SQL
- Experience using Python applied to data analysis and Machine Learning
- You are familiar with common statistical tools and methods, such as hypothesis testing and Generalised Linear Models (GLMs); a minimal GLM sketch follows this list
- You are familiar with modern programming techniques and practices
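As a rough illustration of the GLM-based risk modelling mentioned above, here is a minimal Python sketch using statsmodels: a Poisson claim-frequency model with log(exposure) as an offset. The column names and data are synthetic placeholders and not Zego’s actual features or models.
```python
# Hypothetical, minimal claim-frequency GLM sketch; column names and data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
policies = pd.DataFrame({
    "claim_count": rng.poisson(0.1, size=1000),        # claims observed per policy period
    "exposure": rng.uniform(0.5, 1.0, size=1000),      # policy-years of exposure
    "miles_per_week": rng.normal(300, 50, size=1000),  # illustrative behavioural feature
    "night_share": rng.uniform(0.0, 0.5, size=1000),   # share of driving done at night
})

# Poisson GLM with a log link and log(exposure) as an offset -- the standard
# actuarial setup for modelling claim frequency.
freq_model = smf.glm(
    "claim_count ~ miles_per_week + night_share",
    data=policies,
    family=sm.families.Poisson(),
    offset=np.log(policies["exposure"]),
).fit()

print(freq_model.summary())
```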
What it's like to work @ Zego:
Zego has a truly international and inclusive team, unified by great ideas and collaborative thoughtfulness. Our people are the most important part of our story and everyone plays an essential role in our journey. We look for people who have expertise, enthusiasm and who are motivated by change. There’s plenty of room to learn and grow, as part of our ongoing training programmes or directly from other experts. You’ll work alongside a talented group of people who respect each other's differences and seek to understand fresh perspectives.
Where we work:
We are committed to helping all Zegons deliver their best work regardless of location which is why we have embraced a ‘Flex First’ philosophy. This means home is the primary place of work for all Zegons but you can also spend time with colleagues in our city centre collaboration spaces from time to time too.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, national origin, gender, sexual orientation, age, marital status, or disability status.
#LI-REMOTE
#LI-AS1

Title: Administrative Data Entry Assignment Coordinator
Location: United States – Fully Remote
Job Type: Full-time, Temporary
Description
Temporary with potential to become permanent. Full-time, fully remote. Must be willing to work 8:00 a.m. to 5:00 p.m. Pacific Time.
FRASCO Inc. is seeking an energetic and organized individual for the position of Team Lead for our Assignment / Data Entry Group. This individual must thrive in a fast-paced environment, love technology, and want to advance with a dynamic, fast-growing company. This candidate must have keen attention to detail and be a fast learner.
ABOUT US:
Frasco is a full-service investigation corporation serving the insurance, legal, employment, and entertainment communities. Services we provide include surveillance, statements & interviews, activity checks, background checks, and other investigative endeavors.
The Administrative Data Entry Assignment Coordinator is a fully remote position; however, the applicant must be willing to work 8:00 a.m. to 5:00 p.m. in the Pacific Time Zone.
Administrative Data Entry Assignment Coordinator Responsibilities:
Duties/Responsibilities:
- Assesses incoming referrals and, when complete, assigns them to the team to be opened in an accurate, efficient, and timely manner.
- Works with clients, managers and sales staff to obtain missing information on referrals.
- Assists manager in maintaining and updating system-wide database.
- Ensures confidentiality and security of sensitive data and reports.
- Answers and directs phone calls.
- Performs other duties as assigned.
Requirements
Required Skills/Abilities:
- Excellent organizational skills and attention to detail.
- Strong supervisory and leadership skills.
- Thorough understanding of data entry methods, equipment, and procedures.
- Proficient with Microsoft Office Suite.
Education and Experience:
- High school diploma or equivalent required; Associate’s / Bachelor’s degree preferred.
Compensation and Benefits
- Competitive hourly rate
- Medical Benefits including Dental and Vision
- Life Insurance
- FSA
- Paid Time Off
- Paid Holidays
- 401k plan with matching
- Fun and positive work environment
This is a Full-Time Hourly / Non-Exempt Position. Join a leader in the investigations industry that is financially stable, is experiencing growth, and whose history spans over 55 years! All replies will remain confidential. EOE
Join us in creating better futures through bold decisions.
We’re a data technology studio that focuses on helping our clients create operational change (faster) with their data. We’ve worked inside some of the world’s largest companies: Walmart, Zappos, Tyson, Chick-fil-A, H-E-B, and many more.
Our mission of creating better futures isn’t something we just seek for clients, but for ourselves as well. We are proud to be a collective of high-performers that champion necessary change when we see it. We strive to model new (better) ways of working within our own company — taking a people‐first approach to everything we do.
We’ve been named a Best Place to Work for four years and made Entrepreneur’s list of “Top Company Culture” winners. We’re also proud to have made the Inc. 5000 list of America’s Fastest‐Growing Private Companies for four years in a row (2017, 2018, 2019, 2020).
< class="h3">About the Role:
The Sr Data Engineer role requires both great technical skills and a consultative, leadership-based approach to work and teams. In this role you can expect to be engaged in strategic planning of architecture and systems with an eye for rapid development and performance, as well as to demonstrate technical skills while creating well-thought-out data pipelines and repositories, particularly in the context of warehousing and analytics.
It is imperative that you have a solid understanding of database architectures and are able to identify the right technology fit for client solutions. Solid fundamentals in relational and non-relational design are critical. You will be working closely with teams having varying levels of experience with data systems, both internal RevUnit teams and client teams, and will need to be able to model best practices. Being a good communicator and team player is important to your success in the role.
You will have opportunities to help support machine learning and be involved in defining and building data pipelines on these projects. Knowledge in this space is not essential, but being interested and willing to learn how your craft fits into the larger picture is. We value folks who are passionate about what they do, are ready to offer a hand when needed, and are happy to share their knowledge with the rest of the team. We are looking for someone who has an interest in creating their own destiny by helping grow our data engineering practice.
< class="h3">Key Responsibilities:
What you’ll be doing (mostly)
- Work as an Azure cloud data engineer, an individual contributor, and a team player
- Analyze, design, and determine coding, programming, and integration activities required based on specific objectives
- Manage data pipeline jobs failure issues
- Run Python jobs via Docker containers and optimize them for better performance
- Develop processes, techniques, and tools to analyze and monitor platform performance
- Apply strong working knowledge of Azure data warehouse and lakehouse architectures
- Manage databases and database objects
- Work with Azure Data Lake Storage and Azure Synapse
- Develop Azure Data Factory/Synapse pipelines, drawing on experience with Azure SQL Pools and Apache Spark (a minimal pipeline sketch follows this list)
- Identify, troubleshoot and resolve issues related to slow / failed jobs
- Support ingestion pipelines from a data engineering standpoint
- Experience with SQL (Queries, Functions, Stored Procedures) and Python languages
- Participate in the design of data models for reports
- Experience with Azure Functions and API integration management
- Support architecture for Azure Event Grid and Hub
- Understanding of Identity Management, Security, Data Governance
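As a rough illustration of the pipeline work listed above, here is a minimal PySpark sketch of an ingestion step that reads raw parquet from ADLS Gen2, applies light cleaning, and writes a curated, partitioned dataset. The storage paths and column names are placeholders, and it assumes a Spark session (for example in Azure Synapse) already configured with access to the lake; it is not RevUnit’s or any client’s actual pipeline.
```python
# Hypothetical, minimal ingestion step: read raw parquet from ADLS Gen2, clean lightly,
# and write a curated, partitioned dataset. Paths and column names are placeholders, and
# the Spark session (e.g. in Azure Synapse) is assumed to already have access to the lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"          # placeholder
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders/"  # placeholder

orders = (
    spark.read.parquet(raw_path)
    .dropDuplicates(["order_id"])                     # basic data-quality step
    .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
    .filter(F.col("amount") > 0)                      # drop obviously invalid rows
)

(
    orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet(curated_path)
)
```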
Requirements
- Desire to live into the RevUnit values of: Do what’s right; be fearless; a little better all the time; I’ve got your back, you got mine; and celebrate the wins
- 4+ years experience with:
- Data modeling and architecture experience
- Azure Data Factory
- Azure Data Lake (ADLS2) Technologies
- Python, R, or similar technologies
- Cloud platform experience with data on AWS, Azure, or GCP
- Database performance tuning and query optimization
- Data transformation patterns
- Data analysis and discovery
- Relational and non-relational database principles and design
- Understanding of how data engineering fits within DevOps systems and processes
- Demonstrable experience working with cross-functional application engineering teams
- Demonstrable experience working with infrastructure supporting data storage, data warehousing, data for interactive applications , analytics applications, etc.
- Demonstrable experience working on teams and conveying technical topics to both technical and non-technical audiences
- Ability to think one step ahead and act strategically
- A strong aptitude and interest in independent continuous learning, staying up to date on relevant technology, trends and practices.
Nice to haves:
- GCP certification
Benefits
- RevUnit pays 100% of your inidual premiums for Medical, Dental, Vision, and both Short and Long-Term Disability
- Basic Life and AD&D
- Health Savings Account (HSA) option
- 401k matching program
- Employee Assistance Program (EAP)
- Unlimited paid time off
- 8 Paid holidays
- Paid parental leave
- All Hands Meetings (All-Company. 2x per year. Focus on skill development, team building, and celebration.)
- Company laptop, monitor, and accessories

< class="h1">Description

This is a remote position.
- Extensive experience of 5+ years in Power BI, with 10+ years of overall experience in BI
- Extensive experience with BI tools like Power BI, SSRS, Tableau, etc.
- Hands-on experience with RDBMS platforms, i.e. SQL Server, Oracle, etc.
- Experience designing solutions for aggregated facts using metadata
- In-depth knowledge of database and BI architecture design
- Worked on report/dashboard migration from traditional to modern BI tools like Power BI, Tableau, etc.
- Managed high volumes of data in one-time loads
- Worked on optimizing data modelling, database, and BI solutions
- Applying architectural and engineering concepts to design a solution that meets operational requirements, such as scalability, maintainability, security, reliability, extensibility, flexibility, availability, and manageability
- Developing technology specifications and ensuring that new technology solutions are designed for optimal access and usefulness, leveraging existing technologies when possible
- Good understanding of data visualization tools
- Ability to develop cloud data and BI solutions per enterprise requirements
- Participate in development of business intelligence solutions
- Ability to create and maintain conceptual business, logical, and physical data models
- Experience in translating/mapping relational data models into database schemas
- Develop and create transformation queries, views, and stored procedures for ETL processes and process automations
Skills
Power BI Architect
< class="h1">EducationBachelors
< class="h1">Experience5+ years

We are looking for a Data Engineer proficient in SQL, SSIS, and SSRS. Alongside your strong leadership, communication, and mentorship skills you will assist us with the design, development, and implementation of data solutions.
Role and Responsibilities:
- Technical implementation: file exchange, data integrations, client configuration
- Create client-specific and internal static and ad-hoc reports
- Initiate and support ad-hoc and recurring non-automated tasks
- Ticketing system triage and assignment of tickets
- 1st-level operational support – password resets, ad hoc queries, and research, etc.
- Analysis, design, development of client-specific processes and tools
- Recognition and escalation of core platform modifications
- Communicate, modify, and “negotiate” file specifications
- Creation, sending, receipt, testing, and feedback on all test files
- Perform other duties and responsibilities as required by the company
Skills:
- 2+ years' work experience with Data Management
- 2+ years working with SQL Server, Transact-SQL, SSIS, and SSRS
- Significant experience with configuring complex systems
- Experience with requirements analysis and technical design
- Understanding and work experience with various SDLC methodologies
- Experience working as a member of a multi-member, cross-department project team
- Desirable – Skills and work experience with SharePoint
- Desirable – Experience with Project Management and associated tools
- Desirable – Experience doing internal technical product support

< class="h3">Company Description
John Snow Labs is an award-winning AI and NLP company, accelerating progress in data science by providing state-of-the-art software, data, and models. Founded in 2015, it helps healthcare and life science companies build, deploy, and operate AI products and services. John Snow Labs is the winner of the 2018 AI Solution Provider of the Year Award, the 2019 AI Platform of the Year Award, the 2019 International Data Science Foundation Technology award, and the 2020 AI Excellence Award.
John Snow Labs is the developer of Spark NLP - the world’s most widely used NLP library in the enterprise - and is the world’s leading provider of state-of-the-art clinical NLP software, powering some of the world’s largest healthcare & pharma companies. John Snow Labs is a global team of specialists, of which 31% hold a Ph.D. or M.D. and 75% hold at least a Master’s degree in disciplines covering data science, medicine, software engineering, pharmacy, DevOps and SecOps.
< class="h3">Job DescriptionWe are looking for a US-Based Rockstar data researcher in the healthcare space, who has the clinical and academic background to find & annotate useful data sets. The core aspect of this role is the annotation of clinical notes & other free-text documents - in order to make it easier for software, analytics & data science teams to train natural language processing models.
Key responsibilities:
- Label and annotate clinical notes and other healthcare documents for symptoms, diagnoses, procedures, drugs, allergies, adverse events, vital signs, and lab results
- Extract multiple useful data sets from files, publications, and other sources
- Write clearly to explain the content, values and schema of each data set
- Collaborate with project & release managers to prioritize which data sets to research & annotate
- Communicate gaps in annotation guidelines, edge cases, and suggestions for improvement of the data research and annotation project for the whole team.
Qualifications:
- Based and located in the United States
- Proven healthcare domain expertise - preferably as a clinician, although clinical coders and health data researchers are also welcome to apply
- Strong attention to detail and understanding of detailed clinical, operational, and financial healthcare concepts
- Familiarity with the home health industry is preferred
- An advanced degree such as a PhD, MD, or similar professional or academic experience is preferred
- Concrete & proven experience in data research, in areas such as data curation, database modeling / querying, statistics, data science, or academic research
- If you are hired and contribute data to our repository, your name, photo & short bio will appear as a contributor on our website - please do not apply if this is problematic
- A commitment of 40 hours/week is required
- Only individual freelancers please - no agencies, teams, or companies
- This can potentially be a long-term relationship - we are looking to significantly grow our team - and may grow to include other writing or research assignments
If you are interested, please apply with a cover letter that includes the words "John Snow Labs" and explains why you are the right person for this role.
Additional Information
We are a fully virtual company, collaborating across 22 countries
Looking for U.S. based applicants on this role.
This role requires the availability of at least 30 hours per week.

Data Marketing Expert (German)
Culture Flipper is looking for a Data Marketing Expert (remote).
We are seeking a Data Marketing Expert who will, as a subject matter expert, review materials related to machine learning and AI used to boost online marketing. German must be your primary language.
Job Description
As a Data Marketing Expert, you will work with our German language experts and writers to:
- Review terminology used in content related to machine learning/data marketing tools to ensure that German audiences (especially online marketing professionals) can clearly understand the content.
- Run quality audits by reviewing German translations of various content related to machine learning/data marketing tools, in whole or in part, and share detailed feedback (including revisions) with our language experts, terminologists, and writers to provide insights to the general audience.
- Highlight, edit or add information most relevant to German audiences based on your experience in the local German data marketing industry to content originally written for English audiences.
Qualifications
- At least 3 years of experience working as an e-commerce/digital marketer using machine learning technologies
- Data savvy, able to draw insights and formulate recommendations from data and understand how to build bridges between data and vision
- Up-to-date with the latest trends in digital marketing in the German market
- Familiarity with SaaS
- Native or near-native German proficiency
- Good English reading comprehension skills
- Collaborative and strong communication skills
- Creative problem solver
Employment details
- Engagement Type: Independent contractor on a project basis
- Work Location: Remote
Recruitment Process
Application submission → Screening → On-the-job testing phase (3 weeks) → Review → Pilot phase (3 months)
About Culture Flipper
Your Art
Your Words
Your Audience
Our Mission is to put your work on the map. We put your products and services (‘your art’) in your words for your audience based on a precise understanding of your work in its cultural context.
Who We Are
Culture Flipper is an unconventional multilingual communicator.
We offer contextual communication to make your work click with your audience in the global market, from multilingual copywriting and product localization to original naming and pitch decks.
As a team of digital nomads, Culture Flipper creates and re-creates content that blends seamlessly into diverse regions and markets. We are located around the globe, primarily in Anglo America, Latin America, East Asia, Southeast Asia and Europe. The team includes language experts, subject-matter experts, copywriters, editors, typographers, graphic designers and more. Our 5+ step collaborative content creation process goes beyond transcreation. Names & titles, product UIs, catalogs, campaign slogans, promotional video and marketing copy, press releases, manuals and tutorial subtitles become natural and memorable in their target languages thanks to Culture Flipper’s deep understanding of the subject-matter, trends and cultural context in both the source and target countries. Culture Flipper is a California corporation established in January 2017. We are a diverse and inclusive multicultural agency.

We are looking for a talented Data Engineer Manager to help us change the world of digital advertising together.
WHO WE ARE
At Seedtag our goal is to lead the change in the advertising industry, because we believe that effective advertising should not be at odds with users' privacy.
By combining Natural Language Processing and Computer Vision our proprietary, Machine Learning-based technology provides a human-like understanding of the content of the web that finds the best context for each ad while providing unparalleled risk-mitigation capabilities that protect advertisers from showing their ads on pages that could be damaging for their brand. All of this, without relying on cookies or any other tracking mechanisms.
Every day, our teams develop new services that reach over 200 million users worldwide with fast response times to ensure that we deliver the best user experience. We're fully committed to the DevOps culture, where we provide the platform that our Software Developers and Data Scientists use to manage over 100 different microservices, pushing dozens of changes to production every day. All of this is built on top of Kubernetes in Google Cloud Platform and Amazon Web Services.
If you are interested in joining one of the fastest growing startups in Europe and work on massive scalability challenges, this is the place for you.
KEY FIGURES
2014 · Founded by two ex-Googlers
2018 · 16M total turnover & internationalization & accelerating growth
2021 · Fundraising round of 40M€ & +10 countries & +250 Seedtaggers
2022 · Fundraising round of 250M€ + expansion into the U.S. market + 400 Seedtaggers
ABOUT YOU
Your key responsibilities will be:
- To maintain relationships with team members via 1:1s and performance reviews: career planning, doubt resolution, conflict management, etc.
- To recruit data engineering experts, both as direct reports and for adjacent teams
- To maintain a clear vision of the data lake structure in use company-wide
- To ensure strategic decision-making: helping the team and stakeholders towards optimal usage of our data-lake/data-mesh services and connected pipelines
- To liaise with peer Data Managers and Product Managers to align Data Engineering priorities and design new features
- To support data-vision alignment within the company, cross-team relationships with other data teams, and communication of the team's progress on its products
- To detect and remove the team's blockers. These can be technical blocks (interaction with systems) or management blocks/pain points (lack of control points with other data teams)
- To develop and deploy production-oriented data software.
We're looking for someone who:
- Has at least 4 years of solid experience in data engineering and at least 2 years in a management role
- Has a degree in computer science, engineering, statistics, mathematics, physics, or another degree with a strong quantitative component
- Has ample experience with data engineering tools such as Apache Beam, Spark, Flink, or Kafka
- Is comfortable with object-oriented languages such as Python or Scala, and is fluent in working with a Linux terminal and writing basic bash scripts
- Has experience orchestrating ETL processes using systems such as Apache Airflow (a minimal Airflow sketch follows this list) and managing databases such as SQL databases, Hive, or MongoDB
- Has experience building data-lake and data-mesh infrastructures as a service from scratch
- Knows how to scale up teams and manage projects with different stakeholders
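As a rough illustration of the Airflow-style orchestration referenced above, here is a minimal Python sketch of a daily extract-transform-load DAG. The DAG id and task bodies are hypothetical placeholders that only show the orchestration pattern, not Seedtag's actual pipelines.
```python
# Hypothetical, minimal daily ETL DAG; task bodies are placeholders that only illustrate
# the orchestration pattern (extract -> transform -> load).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw events from the source system")


def transform():
    print("clean and aggregate the raw events")


def load():
    print("write the aggregates to the warehouse")


with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```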
WHAT WE OFFER
- Key moment to join Seedtag in terms of growth and opportunities
- High-performance tier salary bands and excellent compensation
- One Seedtag: Work for a month from any of our open offices, with travel and stay paid, if you're a top performer (think of Brazil, Mexico...)
- ⛰ An unlimited remote working environment, where you can choose to work from home indefinitely or attend our Madrid headquarters whenever you want, where you will find a great workplace location with food, snacks, great coffee, and much more.
- Paid travels to our HQ in Madrid to work p2p with your squad members
- Macbook Pro M1
- ⌛ Flexible schedule to balance work and personal life
- Build your home office with a budget of up to 1K€ (external screen, chair, table...)
- A harassment-free, supportive and safe environment to ensure the healthiest and friendliest professional experience, fostering diversity at all levels.
- Optional company-paid English and/or Spanish courses.
- Access to learning opportunities (learning & development budget)
- We love what we do, but we also love having fun. We have many team activities you can join and enjoy with your colleagues! A Yearly offsite with all the company, team offsites, and Christmas events...
- Access to a flexible benefits plan with restaurant, transportation, and kindergarten tickets and discounts on medical insurance
Are you ready to join the Seedtag adventure? Then send us your CV!


Location: USA only
About Retain:
Retain is the world's first customer insights engine. By sitting naturally in a team’s workflow, Retain enables companies to be strategically customer obsessed. We deliver a system that enables everyone responsible for engaging customers to make data-driven decisions and give a complete view across the entire customer lifecycle.
Retain powers enterprise companies. Our leadership team is composed of experienced founders from LiveRamp ($15M funding to $5B NYSE: RAMP), and early employees from Uber (NYSE: UBER), PagerDuty (NYSE: PD), C3 AI (NYSE: AI), and Drawbridge (acquired by LinkedIn).
We recently announced a $23M Series A with backing from Emergence Capital, Baseline Ventures, Upside Partnership, Afore Capital, and Industry Ventures. Gordon Ritter (an early investor in Zoom, Box, Veeva, and Salesforce) and Steve Anderson (first investor in Instagram, SoFi, PagerDuty, Heroku, StitchFix, and many others) sit on our board.
We are a team of friendly, talented people headquartered in San Francisco, but we operate as a remote-first company. You can learn more about our team and values here: https://www.retain.ai/careers
About the Role:
As an early member of the Retain data science team, you will work with some of the largest enterprises in operationalizing predictive and prescriptive use cases that help customer facing teams increase customer value. You will work directly with data scientists, software engineers, and subject matter experts in the definition of new analytics capabilities that are able to provide our customers with the information they need to be more productive and effective in their customer facing operations. You will help find the appropriate machine learning / data mining algorithms to answer those questions and implement those within the Retain product so they can run at scale.
Retain’s product offering is meant for all users - executives, managers, and individuals - and can be used across functions - sales, customer service, HR, engineering, and more. As a data science lead, you will be helping drive customer value with thought leadership and scaled solutions to solve problems ranging from helping executives prioritize investment budgets of 100s of millions of dollars to enabling individuals to be more efficient and drive work-life balance.
Qualified candidates will have an in-depth knowledge of most common machine learning techniques and their application. You will also understand the limitations of these algorithms and how to tweak them or derive from them to achieve similar results at large-scale.
< class="h3">What We’re Looking For:
Qualifications:
- 2+ years of professional experience applying ML, ideally in a customer-facing role.
- Deep understanding of common ML techniques (regression and classification, timeseries analysis, supervised & unsupervised learning, and deep learning).
- Strong mathematical background (linear algebra, calculus, statistics, optimization).
- Strong coding skills in Python (or a similar object-oriented programming language).
- Ability to drive a project and work both independently and in a team.
- Smart, motivated, can-do attitude, and seeks to make a difference.
- Curiosity and willingness to learn about our customers.
- Excellent verbal and written communication in English.
Preferred:
- Knowledge of git and experience with JavaScript, Java, or Scala is a plus.
- Experience with scalable ML (distributed systems, MapReduce, streaming).
- A portfolio of projects (GitHub, papers, etc.) is a plus.
Responsibilities:
- Provide thought leadership and guidance to internal and external stakeholders across a wide range of use-cases and data science techniques.
- Research, design, implement, and deploy Machine Learning algorithms for Retain’s product offering that can be delivered through a SaaS offering to customers.
- Assist and enable Retain customer teams and customers themselves to get value out of Retain’s product and data using AI/ML.
- Lead a small technical team on a customer initiative, identify use cases, analyze data, build hypotheses, prototype models, and deliver product features and presentations on R&D efforts.
- Contribute to the design and implementation of new product features.
- Support hiring and training a high-performance data science team.
You might be a great match if:
- Business sense - has a strong business sense and can decipher priorities and make sound judgment calls when needed.
- Habit of excellence - performs duties at the highest level possible on a consistent basis and has shown career and personal growth due to it.
- Team player - has team-oriented experience and approach.
- Ability to think outside of the box with a sense of urgency.
- Entrepreneurial mentality - think company first, then team, then self.
Education: MS or PhD in Computer Science, Artificial Intelligence, Optimization, Econometrics, Operational Research, or equivalent.

< class="h3">Company Description

John Snow Labs is an award-winning AI and NLP company, accelerating progress in data science by providing state-of-the-art software, data, and models. Founded in 2015, it helps healthcare and life science companies build, deploy, and operate AI products and services. John Snow Labs is the winner of the 2018 AI Solution Provider of the Year Award, the 2019 AI Platform of the Year Award, the 2019 International Data Science Foundation Technology award, and the 2020 AI Excellence Award.
John Snow Labs is the developer of Spark NLP - the world’s most widely used NLP library in the enterprise - and is the world’s leading provider of state-of-the-art clinical NLP software, powering some of the world’s largest healthcare & pharma companies. John Snow Labs is a global team of specialists, of which 33% hold a Ph.D. or M.D. and 75% hold at least a Master’s degree in disciplines covering data science, medicine, software engineering, pharmacy, DevOps and SecOps.
< class="h3">Job DescriptionWe are looking for a superstar data scientist, who is familiar and experienced with applying machine learning and deep learning in the area of healthcare. This role requires proven hands-on experience training and optimizing models, building production-ready inference pipelines in Python, performing exploratory data analysis & enrichment, and validating models for issues like bias, overfitting, and concept drift.
Since we primarily work in healthcare and life science, background in medicine, pharma, bioinformatics, or biostatistics is highly beneficial. A PhD degree in a relevant field is preferred.
< class="h3">QualificationsThe primary responsibilities will be working as part of a team in customer-facing projects - building models and machine learning, deep learning, natural language processing, and time series forecasting pipelines that address specific business needs. Working knowledge of Python and TensorFlow are a must; experience with Spark, Spark NLP, and other technology stacks is a big plus. The customer facing aspect of this role also requires strong oral & written communication skills.
We are looking for experts who are looking for long-term freelancing contracts and wish to work on cutting-edge problems, learn, and grow. This role is for individuals who can commit at least 30 hours per week to this project. We are not able to consider agency or team applications. This is a career opportunity that will enable you to expand your knowledge and experience of different tools and techniques, work within a team of big data and data science experts, and make a positive impact with your work. If you qualify and are interested, please include the words 'John Snow Labs' in your cover letter and explain why you are the best fit for this role.
Additional Information
- We are a fully virtual company, collaborating across 26 countries.
- Open to candidates worldwide - work remotely from anywhere.
- This is a contract opportunity, not a full-time employment role.
- This role requires the availability of at least 30 hours per week.

Summary
The Wikimedia Foundation is looking for a Database Administrator to join our SRE team to build, optimize and support the platform serving the world’s favorite encyclopædia to millions of people around the globe. Wikipedia and its sister projects are a globally distributed architecture powered strictly by Free and Open Source software. The stack has MediaWiki at its core and MariaDB as its data store, surrounded by an ecosystem of microservices in PHP, NodeJS, Python, Go and Java.
The Wikimedia community continuously generates text content in the form of new articles, revisions and discussions. As part of our team, you would have the opportunity to solve the storage, scaling, access and backup challenges that Wikipedia’s scale provides.
We currently use self-hosted MariaDB as our main RDBMS. Our goal is configuring and managing the database infrastructure in a way that maximizes the service uptime for Wikimedia projects’ users while at the same time simplifying the operational processes, automating repeatable tasks and applying industry best practices.
We are a globally distributed and diverse team of engineers with a drive to explore, experiment and embrace new technologies. If you find what we do interesting, if you are up to the challenge of improving the reliability and delivery of one of the Internet’s top 10 websites, and you enjoy the idea of working in a remote-first role, we may be the right place for you.
You are responsible for:
- Operate and improve the RDBMS systems serving Wikimedia projects
- Respond to common database issues, following documented procedures
- Collaborate with SRE and other teams across the organization to ensure their specific database needs are met on an ongoing basis
- Perform day-to-day operational/DevOps tasks on Wikimedia’s public facing infrastructure (deployment, maintenance, configuration, troubleshooting)
- Implement and utilize configuration management and deployment tools
- Monitor systems, services and service clusters, optimize performance and resource utilization
- Participate in incident response, diagnosis and follow-up on system outages or alerts across Wikimedia’s production infrastructure
- Share our values and work in accordance with them
Skills and Experience:
- 2+ years experience in a DBA role as part of a team
- 3+ years experience with Linux in a production environment
- Knowledge of and experience with (Free and Open Source) RDBMS administration (preferably MariaDB or MySQL) and support, debugging query performance, and schema design; understanding of and experience with replication is essential (a minimal replication-lag check sketch follows this list)
- Comfortable with shell and at least one of the scripting languages used in an SRE/Operations engineering context (Python, Go, Bash, Ruby, etc.)
- Passion for automating tasks and processes
- Strong English language skills and ability to work independently, as an effective part of a globally distributed team
- B.S. or M.S. in Computer Science or equivalent work experience
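As a rough illustration of the kind of routine replication check this role would automate, here is a minimal Python sketch that queries a MariaDB replica for its lag and flags it against a threshold. The hostname, credentials, and threshold are hypothetical placeholders, and this is a generic sketch of the technique, not Wikimedia's actual tooling.
```python
# Hypothetical, minimal replication-lag check for a MariaDB/MySQL replica; the hostname and
# credentials are placeholders. Uses SHOW SLAVE STATUS, which MariaDB supports.
import pymysql

LAG_THRESHOLD_SECONDS = 60

conn = pymysql.connect(
    host="replica-db.example.internal",  # placeholder
    user="monitor",
    password="***",                      # placeholder
    cursorclass=pymysql.cursors.DictCursor,
)
try:
    with conn.cursor() as cur:
        cur.execute("SHOW SLAVE STATUS")
        status = cur.fetchone() or {}
    lag = status.get("Seconds_Behind_Master")
    if lag is None:
        print("replication is not running or not configured")
    elif lag > LAG_THRESHOLD_SECONDS:
        print(f"ALERT: replica is {lag}s behind the primary")
    else:
        print(f"OK: replication lag is {lag}s")
finally:
    conn.close()
```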
Additionally, we’d love it if you have:
- Experience working with or as a part of SRE/Operations/DevOps teams
- Experience with NoSQL products (Cassandra, Redis, Elasticsearch, etc.)
- Experience with LAMP stack technologies (PHP/HHVM, memcached/redis, MySQL) - MediaWiki familiarity would be great!
- Developing Free and Open Source software, or being part of an open-source community
The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and the other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge, and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content, support the volunteer communities and partners who make Wikimedia possible, and advocate for policies that enable Wikimedia and free knowledge to thrive.
The Wikimedia Foundation is a charitable, not-for-profit organization that relies on donations. We receive donations from millions of iniduals around the world, with an average donation of about $15. We also receive donations through institutional grants and gifts. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.
As an equal opportunity employer, the Wikimedia Foundation values having a erse workforce and continuously strives to maintain an inclusive and equitable workplace. We encourage people with a erse range of backgrounds to apply. We do not discriminate against any person based upon their race, traits historically associated with race, religion, color, national origin, sex, pregnancy or related medical conditions, parental status, sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an inidual with a disability, genetic information, or any other legally protected characteristics.
If you are a qualified applicant requiring assistance or an accommodation to complete any step of the application process due to a disability, you may contact us at [email protected] or +1 (415) 839-6885.
More information
Wikimedia Foundation · Applicant Privacy Policy · News from across the Wikimedia movement · Blog · Wikimedia 2030 · Our Commitment to Equity · This is Wikimedia Foundation · Facts Matter · Our Projects · Our Tech Stack
Senior Associate, Data Science
Location: One of our offices or 100% remote
Bully Pulpit Interactive is an outcomes agency made up of strategists, data scientists, and artists. We come from politics, brands, and government and we communicate without barriers and measure what matters. BPI specializes in public affairs, corporate reputation, and social impact. A growing firm, we sit on both coasts and connect Silicon Valley technology, Madison Avenue creative, Washington policy and real people.
The Impact You Will Make
You will bring a broad range of data analysis, visualization and critical thinking skills to use data to identify the right audience to target and measure how effectively our campaigns persuade them. In this role, you will analyze measurement projects and expand the scope of the communications and advertising questions that our team regularly answers. We’re on the leading (occasionally bleeding) edge of ad technology, and the learning curve is quick and steep. We need someone who is ready to solve the problems of today and help us foresee the challenges of tomorrow – all while applying a data-driven mindset to what we do for our clients.
What the Day-to-Day Looks Like
Overview. Own targeting and measurement projects across multiple clients, working on audience identification, experimental design, analysis, and reporting.
Data Science and Analysis. Run sophisticated analyses on survey, media, and audience data and then distill the findings into program recommendations. Streamline and improve analysis processes, R packages, and more widely used tools.
Team Management. Train and mentor team members to expand the agency’s advanced analysis capacity.
What You Bring
- Minimum 4+ years of experience working with data and analytics to solve problems in the real world.
- Strong proficiency using SQL to clean and access data. You know how to use a data warehouse to take a dataset, clean things up, and get the right data to solve the problem at hand.
- Experience using R or Python to analyze data. You can confidently write scripts that take a raw dataset, classify it as needed, and then analyze it with tools like regressions (a minimal analysis sketch follows this list).
- Track record of taking complex analyses and communicating them in writing, in decks, and in conversations to a wide range of audiences and at a variety of technical knowledge levels.
- Knowledge of advanced data science and statistical concepts and experience applying them. You have opinions about how to think about the uncertainty in an analysis and you’re ready to explain that to people who don’t have statistics training.
- Team player attitude. You’re excited to collaborate with other experts in a fast-paced environment.
- Ability to adapt to changes, delays, or unexpected shifts in priorities while maintaining focus and productivity in a high-volume environment.
- Ability to prioritize and complete daily workload and projects with minimal supervision and in accordance with deadlines and shifting priorities.
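As a rough illustration of the kind of scripted analysis described above, here is a minimal Python sketch using statsmodels: it fits a logistic regression to a synthetic survey dataset and reports coefficients with confidence intervals so the uncertainty can be communicated alongside the point estimates. The column names and data are hypothetical, not BPI's actual tooling or results.
```python
# Hypothetical, minimal persuasion-analysis sketch: fit a logistic regression to synthetic
# survey data and report coefficients with confidence intervals. Column names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
survey = pd.DataFrame({
    "saw_ad": rng.integers(0, 2, size=n),  # 1 if the respondent was exposed to the campaign
    "age": rng.integers(18, 80, size=n),
})
logit_p = -0.5 + 0.4 * survey["saw_ad"] + 0.01 * (survey["age"] - 45)
survey["favorable"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

model = smf.logit("favorable ~ saw_ad + age", data=survey).fit(disp=False)
print(model.summary())
print(model.conf_int())  # 95% confidence intervals, one way to convey uncertainty
```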
Benefits
BPI offers a wide range of benefits to U.S.-based employees, including 100% company-paid medical, dental, and vision insurance, a 401(k) program with employer match up to 4%, generous time off for vacation, $50 monthly phone reimbursement, unlimited sick time, and 12 weeks of paid leave for new parents. BPI’s benefits prioritize employee wellness and progressive support to our erse workforce.
BPI offers a comprehensive total compensation package for this role to include but not limited to a targeted base salary range of $80,000-$100,000, an annual discretionary bonus, and generous benefits. Actual base and bonus compensation will be determined by a wide variety of factors, such as skillset, level of experience, and other qualifications.
Bully Pulpit Interactive provides equal employment opportunities to all qualified applicants for employment regardless of gender identity, race, color, religion, ancestry, national origin, age, disability, marital status, family responsibilities, sexual orientation, veteran status, citizenship status, matriculation, personal appearance, genetic information or other protected group status. Let’s keep in touch. Follow us on LinkedIn and Twitter at @bpimedia.com

Prominence is looking for a Data Engineer to assist with dimensional data modeling development for a healthcare data architecture project.
< class="h3">< class="h3">Who We AreProminence is a healthcare technology strategy and implementation firm, focused on helping the nation’s leading healthcare organizations to do more with their data. Founded by former Epic managers, we understand the technology landscape in healthcare and provide IT staffing, advisory services, and analytics solutions to create robust data ecosystems that support clinical workflows, automate operational processes, and expedite research. Whether it’s guiding a technology implementation, establishing governance principles, or developing leading edge analytics, we help our customers make sense out of the mountain of data at their fingertips in order to deliver higher quality care at a lower cost.
Ranked as a best place to work over 27 times (and counting!), Prominence’s culture provides consultants with a supportive environment that allows you to innovate and grow your career in healthcare IT. Additional information is available on our website.
< class="h3">< class="h3">Your RoleOur consultants guide our customers through complex technology requirements to summit the challenge at hand. You will need to be able to create order out of chaos, and compile ambiguous information into tactical action plans.
Our ideal team members are humble, smart, and driven to ensure our customer’s success. This includes a passion to deliver high-quality results, while teaching our counterparts how to fish and grow the skills needed to support and expand upon the deliverables of our projects.
If this sounds like you, and you meet the requirements below, we encourage you to apply. If you know of someone else who would be a great fit, let us know!
Requirements
As a member of our Epic Consulting team, you’ll work closely with our customers to implement and optimize their Epic workflows. In addition to your Epic project work, you will help mentor and grow our customer’s teams, escalate issues, and guide projects to a successful outcome.
< class="h3">< class="h3">Key Responsibilities Perform Epic-related consulting and advisory services, including but not limited to the following:- Apply technical expertise to implement and optimize EMR workflows and data capture.
- Mentor customers to up-level their system knowledge and analyst skills.
- Analyze operational and business requirements, and translate into system configuration.
- Create build documentation and workflow diagrams.
- Track and resolve project risks and issues.
- Lead meetings and participate in ongoing work-product coordination.
- Transparently report on project status and deliverables.
- Develop robust knowledge transfer documentation to hand-off deliverables to customer teams.
- Additional duties as may be required to successfully deliver a project.
- May be invited to participate in corporate functions, events, and meetings.
- Active Certification(s): Cogito Data Model (Clinical or Revenue Cycle), Cogito Tools, Caboodle Developer
- 5+ years of experience as an Epic BI Developer or Data Engineer
- Dimensional data modeling experience
- Caboodle development experience preferred
- SSIS, Azure Data Factory, or Data Lake development preferred
- Demonstrated ability to deliver successful projects remotely
- High degree of professionalism; treats others with respect, keeps commitments, builds trust within a team, works with integrity, and upholds organizational values.
- Highly organized; able to manage multi-faceted work streams.
- Self-motivated; able to maintain schedule, meet deadlines, and monitor your personal work product.
- Highly adaptable; able to acclimate quickly to new project assignments and work environments.
- Creative; not paralyzed by problems and able to work collaboratively to find novel solutions.
- Clear communication skills; ability to clearly convey messaging that resonates with your audience, in clear and concise written and verbal communications.
- Can smell smoke and anticipate issues before they arise, with the ability to escalate effectively.
- Passion to mentor and guide others.
Benefits
Prominence is dedicated to hiring the best and brightest minds in healthcare and maintaining a culture that rewards our employees for following their passion. We are excited to offer the following benefits for this position:
- Competitive Salaried and Hybrid Compensation Plans
- Health Care Plan (Medical, HSAs, Dental & Vision)
- Retirement Plan (401k)
- Life Insurance (Basic, Voluntary & AD&D)
- Dependent & Health Savings Accounts
- Short Term & Long Term Disability
- Paid Time Off (Vacation/Sick & Public Holidays)
- Training & Development Fund
- Technology Stipends (for Qualifying Roles)
- Work From Home
- Charitable Giving to Causes You Believe In
Must be legally authorized to work in the United States without sponsorship.
Commitment to Equal Opportunity
The world’s most talented professionals come from every background. All applicants will be considered for employment without attention to age, race, color, religion, gender identity and/or expression, sexual orientation, national origin, marital status, veteran or disability status, or any other characteristic protected by law. In addition, Prominence will provide reasonable accommodations for qualified individuals with disabilities.
If you are smart and good at what you do, come as you are. All qualified candidates are encouraged to apply.
Partnership Eligibility
Our partnerships are extremely important to us. This online application is not intended for anyone who is currently under a non-compete agreement or has an arrangement that precludes employment at Prominence. We appreciate your help in respecting our partners.
Interested in learning more? Apply below to connect with our Talent team about immediate openings and future consulting projects.

Would you like to be part of a growing national healthcare solutions company? Are you looking to positively affect thousands of lives each day via health benefits? Do you have experience in data analysis and healthcare?
We are hiring for a Data Analyst to join our team.
Who we are
Allied is a national healthcare solutions company that supports healthy workplace cultures.
What we do
We are problem-solvers, innovators, and collaborators. Our purpose is to work with employers to take care of their employees and their families every day – and it all starts with the Allied family.
What’s in it for you?
Allied supports an inclusive culture focused on developing employees to succeed, innovate & impact the community.
Here’s how we do it
Training and Development: Allied offers tailored learning and development curriculums for all employees and a Learning Management Database with thousands of courses for professional and personal development.
Career Mobility: Growth opportunities are endless at Allied. In 2021 alone, one in five employees had a job change. 75% of these job changes were promotions!
Employee Engagement: We pride ourselves on employee engagement! With our recognition program, employees recognize their colleagues monthly or donate to charities with cash rewards. Allied has a dedicated committee planning monthly engagement activities to create endless opportunities to get to know your peers and destress in this new remote world.
Employee Feedback: We regularly survey our employees throughout the year to seek continuous feedback, ideas and suggestions on new initiatives.
Community Outreach: We have dedicated committees focused on fundraising efforts supporting our employees and their families, furthering education goals and providing funds for charitable organizations outside of Allied.
What will you be doing?
The Data Analyst is a goal-oriented team player possessing a combination of technical and data skills used to help build and maintain a variety of reports and other analytical tools. The ideal candidate is skilled in various aspects of data analytics, such as programming, mining, generation, and visualization. As a technical lead, the data analyst will be responsible for resolving issues and ensuring the success of projects assigned to their team.
ESSENTIAL FUNCTIONS
- Develop an understanding of the health insurance world in different areas of the business such as claims, eligibility, customer service, and stop loss.
- Perform analyses of varied healthcare data to evaluate programs and product solutions using medical, dental, flex, pharmacy, lab, provider, and utilization data.
- Maintain high standards of data quality and integrity.
- Work closely with stakeholders to create dashboards and reports.
- Design and code efficiently using SQL and other data reporting languages.
- Help create and test SQL statements for data presentation and further automation.
- Set up and monitor routine maintenance SQL jobs and develop and maintain an alert strategy for failed jobs and database problems and/or failures (a minimal sketch of one monitoring approach follows this list).
- Data modeling, including an understanding of all normal forms and the design and development of stored procedures, views, and triggers.
- Manage daily / monthly data and reporting processes automated through SQL Server agent
- Create reports using a variety of tools and formats such as Excel, Power BI, PDFs, etc.
- Document methods, specifications, and findings clearly; contribute to the writing and the presentation of results, findings, and conclusions.
- Excellent attention to detail; ability to detect and correct problems within SQL scripts
- Meet Allied’s expectations for productivity, quality, and goal accomplishment.
- Work closely with manager to complete assigned tasks
- Adhere to, and apply all applicable privacy and security laws, including but not limited to HIPAA, HITECH and any regulations promulgated thereto.
- Perform other duties and responsibilities, as assigned.
- Attend continuing education classes as required, including but not limited to HIPAA training.
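For illustration only: a minimal Python sketch of the kind of SQL Agent failure check the alerting item above refers to. It assumes a SQL Server instance reachable via pyodbc and the standard msdb job-history tables; the connection string and the send_alert() hook are hypothetical placeholders, not Allied's actual tooling.

```python
# Minimal sketch: query SQL Server Agent job history for recent failures.
# Assumes pyodbc is installed and CONN_STR points at a reachable instance;
# send_alert() is a hypothetical placeholder for email/Teams/etc.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server;DATABASE=msdb;Trusted_Connection=yes;"
)

FAILED_JOBS_SQL = """
SELECT j.name, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE h.step_id = 0          -- job outcome rows only
  AND h.run_status = 0       -- 0 = Failed
  AND h.run_date >= CONVERT(int, FORMAT(DATEADD(day, -1, GETDATE()), 'yyyyMMdd'))
"""

def send_alert(job_name: str, message: str) -> None:
    # Placeholder: wire this to email, Teams, PagerDuty, etc.
    print(f"ALERT: job '{job_name}' failed: {message}")

def check_failed_jobs() -> None:
    with pyodbc.connect(CONN_STR) as conn:
        for name, run_date, run_time, message in conn.execute(FAILED_JOBS_SQL):
            send_alert(name, message)

if __name__ == "__main__":
    check_failed_jobs()
```

In practice a check like this would itself be scheduled (e.g. as an Agent job or a cron task) so that failures surface without anyone polling manually.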
EXPERIENCE, SKILLS & ABILITIES
- 3-4 years of data analyst experience. Health care experience preferred.
- 2+ years’ experience with MS SQL Server.
- Experience with SQL Server Development – creating and modifying stored procedures.
- Ability to use reporting and business intelligence tools like Microsoft SQL Server Reporting Services (SSRS) or Power BI.
- Ability to collaborate well with team members.
- Experience in working in a data warehouse environment as well as the ability to work with large data sets from multiple data sources.
- Proficient computer skills, including Microsoft Office Suite (Word, Excel, Access and Power Point).
- Demonstrated analytical and problem-solving skills.
- Strong written, verbal communication and analytical skills required.
Preferred
- Usage of SQL Profiler
- Experience with Power Automate
- Knowledge of integrating SQL Server solutions in .NET environments, with working knowledge of C#, ASP.NET and SOA with web services
Education
- Bachelor’s degree required with a focus in Computer Science, Math, Statistics desirable.
PHYSICAL DEMANDS
- This position requires long periods of sitting and computer work.
WORK ENVIRONMENT
Remote
Hiring is contingent upon successful completion of our background and drug screening process. Allied is a drug-free and tobacco-free workplace.
Diversity creates a healthier atmosphere: Allied is an Equal Employment Opportunity / Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.

Carry1st is Africa’s leading publisher of mobile games and digital content. Operating at the intersection of games, fintech, and web3, we scale awesome content in underserved frontier markets. Carry1st has a proprietary payments and ecommerce platform which allows players to acquire global content using locally-accessible payment methods. We are backed by awesome investors like: Andreessen Horowitz, Google, Riot Games, Avenir, Konvoy Ventures, and Nas.
As a Head of Data Analytics you will build the foundation for a world-class analytics team that translates data into actionable insights to drive business performance across gaming, payments, and e-commerce products.
Serving as a Leader in the Analytics organization, you will use your deep industry expertise to develop and govern our data strategy, ensuring we adhere to best in class principles and have confidence in the outcomes of our experiments.
Working closely with our technical and product teams, you will play a crucial role in helping us build better products and acquire new customers effectively as we scale.
You will...
- Effectively lead the analytics team. Hire, train, and empower data analysts and scientists
- Define overall data strategy and roadmap, taking into account diverse business goals
- Set technical standards / tools to implement the strategy in a scalable, cost-effective manner
- Develop ways-of-working with product, growth, and engineering to reduce waste / noise
- Operationalize company-wide, team-specific, and product-specific performance metrics via dashboards and automated reporting
Requirements
What makes you a great candidate?
- Experience building and scaling processes from scratch, ideally in a F2P gaming company
- Hands-on experience with data analytics and visualization using large, complex data sets
- Experience setting up and running experiments (A/B, A/A/B, etc.) for B2C applications (a minimal significance-test sketch follows this list)
- Demonstrated ability to recruit and manage high-performing, diverse teams
- Strong first principles thinking and ability to distill complex systems into component parts
- Experience translating requirements into actionable tasks for technical teams
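For illustration only, and not Carry1st's actual methodology: a minimal, standard-library Python sketch of evaluating a simple A/B conversion experiment with a two-proportion z-test. All counts and sample sizes below are invented.

```python
# Minimal sketch of a two-proportion z-test for a simple A/B conversion
# experiment (illustrative numbers only; not Carry1st's methodology).
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for H0: p_a == p_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

if __name__ == "__main__":
    # Variant A: 480 conversions out of 10,000; variant B: 540 out of 10,000.
    z, p = two_proportion_z_test(480, 10_000, 540, 10_000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # compare p against your chosen significance level
```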
Benefits
What will it be like to work at Carry1st?
Carry1st is a fast-growing, dynamic place to work. And while we are laser focused on building the flagship mobile internet company in Africa, our team is fully remote and 100% global - with people in 20+ countries! At Carry1st, you will…
- Work with one of the largest consumer data sets in Africa
- Grow with a VC-backed startup at the intersection of gaming, fintech, and web3
- Work from anywhere in the world and with high performing, international teammates
- Own shares in the Company - enabling you to benefit from the value you create every day
Some additional perks…
- Co-working excursions: Travel to meet your colleagues in cities around the world
- Awesome equipment: Get everything you need to work effectively
- Remote working allowance: Put an additional $600 / year to optimize your WFH experience
- Learning and development: Attend courses, conferences and training events
- Social events: Participate in regular company events to relax and connect with teammates
- Birthday leave: Enjoy a paid day off on your special day
We hire great people from a wide variety of backgrounds, not just because it's the right thing to do, but because it makes our company stronger. Join us!
Learn more about Carry1st…
- Andreessen Horowitz shared why they invested in Carry1st
- Bloomberg announced that Nas and Google joined Carry1st’s $20M round
- Quartz reported on Carry1st and the potential of the African gaming market


Why you should join our team
VELCO's Data Architect is passionate about building collaborative and integrated environments. VELCO is looking for a Data Architect motivated by creativity and focused on customer service to join our Applications Team within the Information Technology department. The Data Architect will provide leadership and support to establish, build, and implement the VELCO Enterprise Data Management (EDM) strategic plan.
How you will make an impact
• Utilize data management best practices to provide actionable information and establish VELCO corporate system of record
• Integrate internal and external data sources using the ETL process within the EDM system
• Outline data management governing standards and principles
• Implement a data catalog of enterprise data assets and characteristics
• Evaluate and implement the use of data management and modeling tools and technologies
• Develop a reference architecture that describes and defines data specification
• Define how data flows from origination to destination points
• Define, manage, and control data integration techniques and processes
• Ensure compliance with data quality, data assurance, data security, as outlined by VELCO security and compliance policies
• Perform related database operations and management tasks as required to support database administration
Who you are
A Bachelor's degree in Computer Science, Data Science, or a related field; equivalent work experience is also considered. A Master's degree may be substituted for some experience. Advanced data management training is highly desirable.
The Data Architect will typically have at least 5 years of related experience, including:
• Data management experience including Data Catalog, Data Governance, Data Quality for EDM
• Enterprise database management using SQL Server Management Studio
• Experience with SQL, Python, R, HTML, or JavaScript
• Database programming experience – stored procedures, views, triggers, etc.
• Strong analytical and problem-solving skills
• Certification or training for two of the following: FME, SSIS, SSAS, Power BI, ETL, Collibra, Informatica
• Prior experience employed as any of the following: DBA, Data Engineer, Data Architect, Data Manager
• Real-time asset monitoring using software such as PI from AVEVA (desirable)
VELCO's commitment to our employees
• Paid Time Off – Everyone needs time off to recharge. New employees are frontloaded with prorated time off dependent upon date of hire
• Sick Leave – We advocate for staying home when you are sick
• Paid Holidays – We observe 11 paid company holidays
• Medical – Select from two comprehensive health plans. Employees have access to a Health Savings Account and Flexible Spending Account. We allow documented Domestic Partners on our plans
• Dental & Vision – Offers a wide range of covered services
• Wellness Program – Allows you to receive up to $100 per year for reimbursement of approved wellness expenses
• Employee Assistance Program (EAP) – Aids with mental health, stress management, work-life balance, financial and legal matters, and much more
• 401K – We offer a very generous employer match and profit share contribution
• Short Term Incentive Bonus – VELCO proudly offers a yearly incentive bonus based on individual contributions and key performance indicators
• Life Insurance – VELCO paid life insurance and supplemental coverages are available
Important Considerations
• Starting pay will be determined at the time of offer based on the experience, education, and training of the successful candidate.
• Eligible applicants must be authorized to work in the United States.
• Any offer of employment will be contingent upon successful reference check, background check, physical examination, drug screening, and documentation of Covid-19 vaccination
• Remote, hybrid and on-site work arrangements available
• VELCO is an Equal Employment Opportunity & Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran
The Fine Print
Prolonged periods of working on a computer. Must be able to maintain fast pace while completing complex, analytical work, in potentially stressful situations, with competing priorities, within tight timelines and with frequent interruptions. Must be able to work additional hours when necessary. Periodic travel and overnight stays will be required when attending industry workshops, trainings, etc.
Who VELCO is
VELCO sits at an inflection point of dynamic innovation in an industry undergoing transformation. Our core responsibility to operate a safe and reliable electric transmission system for Vermonters has not changed. How we fulfil that responsibility, however, changes regularly with the evolution of the system we operate and the dynamic demands of the customers we serve. These conditions require that VELCO remain unshakably focused on operational excellence today while anticipating the capabilities we will need to thrive tomorrow. Likewise, the successful Data Architect will have the capabilities to meet today's responsibilities while having the foresight and learning orientation to anticipate VELCO's needs well into the future.
The Data Architect at VELCO is a trusted partner excited to help us realize our Vision to create a sustainable Vermont through our people, assets, relationships and operating model. The Data Architect is someone innately committed to fulfilling our Mission to serve as a Trusted Partner by passionately living our TRUST values (Treating each other with respect; Responding with urgency; Unconditionally supporting one another; Sharing information; and Thinking outside the box).

About Pensa Systems
Pensa is a group of innovators driven to help CPG (Consumer Packaged Goods) brands and retailers grow by providing them the source of truth for the retail shelf. Our computer vision and patented artificial intelligence and machine learning is the world's first fully automatic capture-to-insight retail shelf intelligence solution that learns as it goes. We provide our syndicated SaaS solution to top global CPG brands and retailers such as Johnson & Johnson, General Mills, Anheuser-Busch InBev, CircleK, and Unilever to address a trillion-dollar industry blind spot by digitizing physical inventory and bringing it online. Although we work hard and are accountable to our customers and each other, we don't get too wrapped up in protocol or bureaucracy. We are a diverse group whose collaboration and open communication helps us all do our best work and succeed as a team. If you are looking to grow personally and professionally while making a difference with something new and innovative, Pensa may be the right place for you.
If you would like to learn more about us, please visit pensasystems.com.
Job Description
Our software team is responsible for all of the functionality associated with acquiring, processing, and reporting the state of retail shelves. This spans mobile applications, machine learning, web applications, and data analytics. We work at large scale deployments over multiple cloud vendors with a system that has high uptime and throughput requirements.
The Data Platform Lead is a key member of the engineering team who sits at the intersection of product development and the overall technology vision. They are responsible for providing leadership from an enterprise-wide perspective in the definition, design, review, selection, communication, and implementation of our data architectural direction to most effectively realize our value, vision and strategies.
As the Data Platform lead, you will define strategy and drive the Data Platform roadmap, help build it out, and work with a team of engineers to deliver it.
In this role, you will
- Lead critical data platform architecture decisions across the whole Pensa Data Platform
- Lead the vision and design of architectural roadmaps and data technology architectures
- Provide expertise on the overall data engineering best practices, standards, architectural approaches and complex technical resolutions
- Strategize and lead delivery of complex client engagements by designing Pensa Data Delivery solutions:
- Understand client goals and objectives
- Assess their current state and identify gaps
- Develop high-priority use cases
We'd love to hear from you if you have
- 10+ years of experience in a Data Platform Architect or Solution Architect role in Data Engineering/Analytics-related businesses with various infrastructure technologies.
- Experience with large scale systems
- Experience with technologies like Python, Airflow, distributed and streaming technologies (Kafka, Kinesis), ETL tools, CI/CD, Terraform, data modeling for transactional and reporting workloads, and Data Warehousing solutions (a minimal Airflow sketch follows this list).
- Experience building data-intensive applications in at least one cloud environment
- Ability to define, identify and resolve problems with varying degrees of complexity.
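Purely as an illustration of the Airflow experience mentioned above (not Pensa's actual pipelines): a minimal daily DAG with placeholder tasks, assuming Apache Airflow 2.x is installed. The dag_id and the callables are hypothetical.

```python
# Minimal sketch of a daily Airflow DAG (placeholder task logic only).
# Assumes Apache Airflow 2.x; dag_id and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull raw events from the source system")  # placeholder

def transform_and_load() -> None:
    print("clean, model, and load into the warehouse")  # placeholder

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)
    extract_task >> load_task
```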


Passionate about Microsoft technologies? Looking for your next big challenge? You've come to the right place. We are currently on the lookout for a Database Developer to join our growing team and work with us on an interesting large-scale FinTech project. The fun side: the project is fully immersed in a wide variety of Microsoft technologies (.NET, Azure, MSSQL, Data Factory, Dynamics...).
Our projects are quite challenging but don't worry - you will be working with a team of passionate, friendly and inquisitive self-starters, who like to follow new technology trends and constantly improve themselves as well as the projects they're working on.
Your role and responsibilities
As a Database Developer you would be working on a FinTech project that utilises virtual payments, cards, and wallets for real-time funding, while tracking and analysing those payments for a behaviour-driven brand-building system. You would be responsible for developing, testing, improving and maintaining new and existing databases to help its users retrieve data effectively. To be more precise, you will be:
- Designing, creating and supporting stable, reliable and effective databases that can withstand various attacks and loss of information
- Modifying various databases according to user requests
- Involved in schema design, code reviews and SQL query tuning
- Using T-SQL to develop stored procedures, functions, triggers and views
- Working together with developers to improve and optimize the performance of different applications
- Installing, tuning, maintaining and upgrading different DBMS solutions
- Researching, suggesting and owning the process of implementing new solutions and technologies
- Writing technical documentation and providing occasional support to different teams
- Testing, troubleshooting and solving occasional database issues and malfunctions
- Creating reports per user request
- Providing data management support to different teams
- Working with distributed teams in an Agile oriented environment
About you
You're someone with 2+ years of experience working with databases who has been responsible for database development and management/administration on previous projects. You are proficient in everything data related, from inception to execution, maintenance and upgrade. Tackling both relational (MySQL, SQL Server, PostgreSQL...) and non-relational databases (MongoDB, Redis, Cassandra, HBase) is an average day for you. Someone said queries? Well, that's not an issue for you, since you can write an SQL query to fetch almost anything. The basics of SQL Server administration (users, permissions, backup, recovery, monitoring...) are also nothing new to you. And since you're analytical, you know your way around creating different reports (SSRS, SSAS). Sounds like you? Great, head on to the next section.
What next?
If you're ready to be a part of a team that works together to achieve both technical and personal greatness be sure to hit apply.
We will carefully select all the candidates for the next steps. For a detailed info on our hiring process and what to expect, be sure to check out our Careers page.
Questions?
Not sure if you're the right person for this? Need more info about the project or us? Don't worry, I'm here for you :) Be sure to drop me a message whichever way you like:
- E-mail: lejla.musovic @ klika.ba
- Phone: +387 61 907 780
- Viber: +387 61 907 780
- LinkedIn: https://www.linkedin.com/in/le...

OTA Insight is a commercial platform for the hospitality industry (our clients are Airbnb property owners, individual hotels and major chains like Hilton, Accor, etc.). These are exciting times at OTA Insight: we recently raised another round of funding ($80 million) and we have welcomed two new companies, Transparent and Kriya, to our organisation that will expand our product portfolio! To accomplish our ambitious plans we are growing our Engineering team.
As a Data Science Team Lead, you will be a leader & mentor to a team of Data Scientists with a full range of seniority. You will be responsible for stakeholders management and driving the roadmap of Data Science research related to existing product solutions.
In this role you will be partnering with multiple different stakeholders, such as our product & design teams, and our truly inspiring team of technical experts on various subjects (DevOps, data engineering, data science, fullstack, security, …). Additionally, you will be responsible for the happiness and growth of Data Scientists in your team.
This is a high-impact role within OTA Insight which combines people and technology leadership.
Responsibilities
- Drive product innovation (in partnership with Product Management):
  - Deeply understand our product and business strategy and client needs;
  - Actively participate in the product ideation and be a critical voice in the process;
  - Own a backlog of product related data research initiatives.
- Be the bridge between business stakeholders and your team:
  - Coordinate between different stakeholders;
  - Gather requirements and translate them into clearly defined tasks;
  - Document & verify high-level architectural decisions related to Data Science;
  - Partner with product & engineering leadership to define strategy and vision on technology, product or team-related topics and take responsibility in defining how that impacts your team.
- Be a great leader:
  - Plan and prioritise work within your domain roadmap, absorbing planning complexity for team members;
  - Be a sounding board to the team and share knowledge and expertise;
  - Review Data Science projects and take action where necessary;
  - Support team members in their personal and professional growth;
  - Make your team a better place and feel part of OTA Insight’s mission.
- Grow a great team:
  - Hire, develop and coach an exceptional team of Data Scientists;
  - Ensure scalability of processes and best practices in your team.
Requirements
- 5 years working in a Data Science role:
  - Understanding of the steps required in cleaning and transforming datasets;
  - Strong expertise in statistics;
  - Knowledge of standard ML techniques;
  - Ability to derive and clearly communicate insights from a dataset;
  - Experience with a modern Data Science tech stack (Jupyter Lab, cloud infrastructure, etc.).
- Proven record of management skills:
  - Comfortable delegating work through a team of both juniors and seniors;
  - Ability to coach a team, both juniors and seniors;
  - Actively addresses problems when identified and assumes responsibility for work;
  - Able to lead, prioritise, delegate and inspire in an ambiguous, highly dynamic, and highly demanding technology environment.
- Willing to acquire deep knowledge of how revenue management and distribution work within the hotel industry.
- Pragmatic in problem solving and decision making.
- A clear communicator in decisions and the rationale behind them.
- Preferably, has successfully participated in product ideation processes.

Hiring a Senior Data Engineer as we scale our team at Ready and Broadband Money:
Ready is working to level the playing field on this century’s most important utility - making sure everyone has access to fast and reliable internet. And we’re taking a novel approach that’s picking up tailwinds. We need your expertise in database management, spatial-temporal data ETL processes, and model building as we expand our customer base, the volume of data, and the complexity of our platform.
This is a golden opportunity to join a fast-moving tech company in an industry poised to receive billions of dollars in federal funding – imminently. This means more customers are continuously seeking more from our mapping products. As a Data Engineer at Ready, there’s an opportunity to play a meaningful role in what we’re building.
A bit about you 🥇
- 5+ years related experience (Data Engineer); experience with spatial-temporal databases is a plus
- Experience in data pipelines, big data architecture, data warehousing, ETL and modern ELT processes and automation, and data governance
- Solid experience in frontend OR backend development (full stack is a plus)
- Experience in data preparation, data analysis and building prediction models with open source tools: R, Python, etc.
- You’re organized, thoughtful, creative, and rigorous
- You’re humble, honest, and scrappy
- Able to work efficiently and independently; proactive communicator in a distributed and often asynchronous environment
- You enjoy building lean, inclusive, and performant teams and mentoring junior engineers
About your role at Ready ⚡️
- Design, develop, and implement data pipelines to process incoming data in a scalable manner
- Manage and maintain existing databases
- Automate existing processes, such as data backups, security checks, alerts and disaster recovery, to streamline the workflow
- Develop APIs for data administration and for serving data to other parts of the product
- Work with customers to understand their problems; work with your teammates to devise solutions to those problems
- You’ll have a major impact on your team and company here at Ready
- This role could become a leadership position within Ready as we continue growing together
About Ready 🚀
- Creative problem solvers approaching an antiquated system with a revolutionary viewpoint
- Humble but ambitious, knowledgeable but curious, persistent but not obnoxious
- Concise and effective in written and spoken communication
- Comfortable working remotely
About what you get…
- Competitive salary plus meaningful equity upside
- Competitive (and ever expanding) benefits for employees and dependents
- Opportunities to learn and grow – all things startups
- A chance to play a role in defining the roadmap as we pursue a bold vision and a big goal
- Work from anywhere you want, as long as you can get great internet (and your work here at Ready helps make this true in more places)
- To get away - we all convene 2x / year for [optional] retreats
- We’re actively shaping our benefits program: have a say in which benefits matter to you
- The charter to build a product in a market that is set to receive $65 billion in grant funding across the United States


QS is a global leader in higher education services. We are the data analytics, performance insight provider and intelligence partner supporting university excellence across the world. Our student recruitment and enrolment solutions enable universities and business schools to connect with talented individuals that are seeking to further their academic progress and career development. Our portfolio of professional services includes consultancy, student mobility and academic partnerships management, and branding solutions.
We publish highly visible and influential rankings of international universities, including the QS World University Rankings® which reaches a global audience of hundreds of millions of people. QS keeps growing organically, through acquisition and by recruiting top talent across key regions.
Position Summary:
The Senior Data Analyst for QS Enrolment Solutions will analyse large volumes of complex data to produce insights that can improve business performance. They will help identify stakeholder requirements and develop insightful dashboards and reports to support business decisions and provide customer insights.
Working collaboratively with global stakeholders the Senior Data Analyst will identify and adopt best practices to solve complex problems in a data- driven manner and use strong analytical skills and business acumen to support leadership in influencing critical business decisions.
The scope of this role will be diverse, from ad-hoc pieces of analysis to ensuring timely regular reporting, enhancing existing reporting pipelines, and working on long-term strategic projects to enhance our BI & Analytics capability.
This role will be required to collaborate with internal stakeholders to understand their data requirements and analyse outcomes as necessary.
Responsibilities:
- Perform data analysis using statistical techniques; identify, analyse, and interpret trends and patterns in new and existing datasets to present insights.
- Guide and mentor the junior data analysts in the team
- Be responsible for defining standardised metrics, methodologies, and models to ensure consistency and accuracy across all analytical and reporting activities
- Make recommendations about the methods and processes in which QSES obtains and analyses data to improve accuracy and the efficiency of data processes
- Collaborate with internal stakeholders to identify, prioritise and implement new data sources and analytical needs of the business.
- Compile reports and provide insights to internal stakeholders to measure and improve business performance.
- Specification and documentation of new reporting requirements.
- Benchmarking and trend analysis against performance targets.
- Write and execute SQL queries to extract and manipulate data for the purpose of analysis and reporting.
- Testing and documentation of data models.
- Check and verify data and analyses to ensure accurate reporting
- Identify and resolve the root cause of data quality issues and develop solutions to improve data quality.
- Support cross-functional teams on day-to-day execution and support ad-hoc requests
- Perform data operations activities such as matching, formatting and mapping data to ensure a global common data set.
- Promote 'Self-serve' analytics by mentoring internal stakeholders in the interpretation and use of analytical reports and models.
- Assist the Director of Business Intelligence with any other duties as required
- BS in Mathematics, Economics, Computer Science, Information Management or Statistics, or a closely related field or equivalent demonstrable experience.
- 5+ Years experience working in Data Analyst or Data Science role.
- Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
- 3+ years' experience in formatting, analysing and presenting data in Tableau, Power BI or other BI reporting applications.
- Experience in mentoring a team of analysts.
- Ability to write complex queries in SQL and model data from data processes.
- Experience in maintaining or conceptualising analytical models using R or Python.
- Technical expertise regarding data models, data mining and segmentation techniques
- Experience of working within an RDBMS and Data Warehouse environment.
- Highly developed critical thinking and problem-solving skills with an ability to conceptualise and think creatively.
- Experience of working in small, high-impact teams required to meet tight deadlines.
- Excellent communication skills, both written and verbal, and the ability to elicit reporting and analytical requirements and to present difficult concepts in understandable terms to all levels of the organisation.
- Able to work autonomously and within a team including managing own workload
- Proactive attitude with the ability to use initiative to identify potential issues and offer creative solutions.
- Knowledge of Agile and Waterfall methodologies
- Excellent time management and organisational skills with an aptitude to work under pressure
- Willingness to offer and discuss opinions whilst considering those of others in the team and the wider business
- Competitive package.
- Flexible working.
- Vibrant social environment and multicultural, multinational culture, strong team spirit.
- Focus on welfare – ride to work scheme, global wellness team, Calm app, EAP and health plan, mental health first aiders, diversity and inclusion initiatives.
- Strong recognition and reward programs – peer recognition platform, quarterly and annual awards, annual bonus scheme.
- Support for volunteering and study leave.
QS Quacquarelli Symonds is proud to be a fair and equal organization where everyone has the same opportunity to achieve their full potential, irrespective of their background or personal attributes. We celebrate our diversity and believe through sharing our experiences we can learn from one another, be stronger together, and enable our business to thrive.

Mind Computing is seeking a Senior Development Data Security Operations (DDSO) SME to join our growing team in support of the Department of Veterans Affairs, specifically the Financial Services Center (FSC). The FSC serves customers internally at the VA and other government agencies (OGA).
The candidate must reside within the continental US.
Responsibilities:
- Serve as an advisor to leadership concerning the development, design, maintenance, and implementation of enterprise-level Cloud systems.
- Produce architecture diagrams and blueprints, and bring solutions to reality with a DevSecOps mindset and culture.
- Provide technical guidance and foster a collective understanding of data flows and security issues encountered in cloud applications and services.
- Work in Agile methodology and partner with scrum teams for secure application and/or infrastructure solution architecture.
- Deliver Cloud Security Architecture/DevOps on assigned projects using any Cloud Service Provider (CSP).
- Manage the operation and advancement of the CI/CD pipeline.
- Support the government PMO management to lead the DDSO Center of Excellence and meet Center of Excellence requirements.
Requirements:
- Bachelor's Degree in Engineering or Technical related discipline.
- 8 years of experience working functionally across financial services organizations
- Experience and understanding of Infrastructure as Code, Automation, and Orchestration.
- Experience deploying web and service-based applications in Windows/Linux environments.
- 2 or more years of experience doing application and/or infrastructure solution architecture for Azure or AWS equivalent products and services.
- Knowledge and experience across IT infrastructure with security frameworks and standards such as ISO 27001, NIST, and other relevant security-related regulations.
- ISC or CISSP certification.
Additional Qualifications
- Experience in the VA
- Ability to obtain a government clearance.
Benefits:
- Medical/Dental/Vision
- Corporate Laptop
- PTO + Federal Holidays + Sick Leave
- Training opportunities
- Remote work
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status.
Sorry, we are unable to offer sponsorship at this time.

We are looking for a talented Senior Data Scientist to join our Data team and help us change the world of digital advertising together.
WHO WE ARE
At Seedtag our goal is to lead the change in the advertising industry, because we believe that effective advertising should not be at odds with users' privacy.
By combining Natural Language Processing and Computer Vision our proprietary, Machine Learning-based technology provides a human-like understanding of the content of the web that finds the best context for each ad while providing unparalleled risk-mitigation capabilities that protect advertisers from showing their ads on pages that could be damaging for their brand. All of this, without relying on cookies or any other tracking mechanisms.
Every day, our teams develop new services that reach over 200 million users worldwide with fast response times to ensure that we deliver the best user experience. We're fully committed to the DevOps culture, where we provide the platform that our Software Developers and Data Scientists use to manage over 100 different microservices, pushing dozens of changes to production every day. All of this is built on top of Kubernetes in Google Cloud Platform and Amazon Web Services.
If you are interested in joining one of the fastest growing startups in Europe and work on massive scalability challenges, this is the place for you.
KEY FIGURES
2014 · Founded by two ex-Googlers
2018 · 16M total turnover & internationalization & accelerating growth
2021 · Fundraising round of 40M€ & +10 countries & +230 Seedtaggers
2022 · Fundraising round of 250M€ + expansion into the U.S. market + 400 Seedtaggers
YOUR CHALLENGE
- You will work identifying, exploring, and making sense of data sources.
- You will partner closely with the business teams to assess how data can provide value in their units.
- You will develop machine learning solutions to solve business problems: anomaly detection, price optimization, etc. (a minimal anomaly-detection example follows this list).
- You will develop and deploy production-oriented data software.
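A minimal, purely illustrative example of the anomaly-detection work mentioned above, using scikit-learn's IsolationForest on synthetic data; it is not Seedtag's production model, and the contamination rate and data shape are invented.

```python
# Minimal sketch: unsupervised anomaly detection with scikit-learn's
# IsolationForest on synthetic data (illustrative only; not Seedtag's models).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))    # typical observations
outliers = rng.uniform(low=6.0, high=8.0, size=(10, 2))   # injected anomalies
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.02, random_state=42).fit(X)
labels = model.predict(X)  # +1 = inlier, -1 = anomaly

print(f"flagged {int((labels == -1).sum())} of {len(X)} points as anomalous")
```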
YOU WILL SUCCEED IN THIS ROLE IF
- You have 2-4 years of solid experience in data science and machine learning
- You preferably have a degree in computer science, engineering, statistics, mathematics, physics, or another field with a strong quantitative component
- You understand the theoretical functioning of the main algorithms in ML and you have experience applying these to real problems
- You have ample experience with one or more of these machine learning tools: Scikit-learn, Tensorflow, PyTorch, etc.
- You are a proactive person who likes the startup work culture
OUR DNA
We are an AdTech family where innovative ideas and new ways to do things are welcome; we reject "that's the way it's always been done". At Seedtag you will find an energetic, fresh, multicultural workplace, with members from different countries across Europe, LATAM and beyond, where you will have the chance to directly impact company results.
Seedtag's DNA has been unique from the very beginning: we celebrate and embrace diversity, and we want all our members (they, he or she) to feel at home; all human differences are welcome.
SEEDTAGGER'S EXPERIENCE
"Do you want to be involved in the whole process of developing data technology? Then, Seedtag is your place. We are a growing team of techies who are pushing the data to every corner of the company. We are in charge of every aspect of the data pipeline, from the first POCs to the production deployments. If you are a curious inidual who loves playing with data and technology, then don't waste time, apply!" ( Sergio Rozada, Data Scientist at Seedtag)
SEEDTAG BENEFITS
- Key moment to join Seedtag in terms of growth and opportunities
- Career ladder plan for your professional growth
- High-performance tier salary bands and excellent compensation
- One Seedtag: Work for a month from any of our open offices with travel and stay paid if you're a top performer (think of Brazil, Mexico...)
- Paid travels to our HQ in Madrid to work p2p with your squad members
- Macbook Pro M1
- ⌛ Flexible schedule to balance work and personal life
- ⛰ An unlimited remote working environment, where you can choose to work from home indefinitely or attend our Madrid headquarters whenever you want, where you will find a great workplace location with food, snacks, great coffee, and much more.
- Build your home office with a budget of up to 1K€ (external screen, chair, table...)
- A harassment-free, supportive and safe environment to ensure the healthiest and friendliest professional experience, fostering diversity at all levels.
- Optional company-paid English and/or Spanish courses.
- Access to learning opportunities (learning & development budget)
- We love what we do, but we also love having fun. We have many team activities you can join and enjoy with your colleagues! A Yearly offsite with all the company, team offsites, and Christmas events...
- Access to a flexible benefits plan with restaurant, transportation, and kindergarten tickets and discounts on medical insurance
Want to be a Seedtagger? Then send us your CV, we are waiting for it!

mod.io is a fast-growing, early-stage startup backed by leading gaming venture capital firms, working to bring user-generated content (UGC) to games and their players by offering a ready-to-go digital logistics solution that includes community tools, a web UI, and a REST API for any developer to integrate into their game.
The mod.io platform supports over 550,000 daily active users, and we are seeking a Data Engineer to help us scale the industry's best solution for exploring and installing UGC that will be embedded in some of the largest games in the world.
The mod.io service is centered around the REST API which allows developers to bring the modding and UGC experience into a game's UI rather than requiring players to run an external tool or integrate mods manually.
Why mod.io
The popularity of in-game content is exploding; it's in more games and on more platforms than ever before. We have experienced up to 20x growth in our key metrics and have a database with over a billion rows, with plans to grow even larger.
So if you enjoy solving scaling challenges for a company expecting significant growth, and you want to work at a company and in an industry where data and data insights matter, then mod.io is the place for you.
Requirements
We are looking for a savvy Data Engineer to join our growing team. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection across 1,000s of data inputs.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer is also a hands-on software developer, able to write and modify the code to capture relevant data, massage it into meaningful insights and make these insights available via APIs. You will also ensure we have an optimal data delivery architecture that is consistent throughout all our projects.
You must be self-directed and comfortable supporting the data needs of multiple project teams.
You will be excited by the prospect of designing and optimizing our company’s data architecture to support our next generation of growth, products and data initiatives.
Your Responsibilities:
- Understand mod.io’s data needs and objectives
- Provide architectural and implementation leadership for data architectures, data warehouses, data lakes and other cloud-related data initiatives based on mod.io’s needs and objectives
- Build and maintain an optimal data ingestion & ETL solution
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Explore and implement ways to enhance data quality and reliability
- Identify opportunities for (relevant) data acquisition
- Consider legal and regulatory obligations in regard to data capture and usage
- Identify, design, and implement internal process improvements: automate manual processes, optimize data delivery, (re)design infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (a tiny illustrative ETL sketch follows the responsibilities list)
- Build or implement BI and query/reporting tools that utilize the data pipeline to provide actionable insights for key business performance metrics for presentation internally and externally (eg: customer dashboards etc)
- Work with project teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across international boundaries through multiple data centers and AWS regions.
- Provide data and analytics expertise as we strive for greater functionality in our data systems.
- In relation to our data pipeline architecture:
- Deploy and maintain production environments that require high availability.
- Monitor data servers to proactively identify performance issues, problematic trends and troubleshoot/escalate to resolve as appropriate.
- Drive the product towards higher availability and reliability & assist with on-call support on a rotating schedule for incident escalations (24x7).
- Ensure our data services meet stability, performance, and availability requirements.
- Monitor backups, usage, capacity, and performance of servers; liaise with users and/or vendors to address problems and changes in requirements.
- Build robust, self-healing features and automation that reduce operational effort and improve service up-time.
- Self-starter mindset with a strong drive to learn and own engineering initiatives to promote a culture of continuous improvement, and engineering excellence.
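A deliberately tiny, illustrative sketch of the extract-transform-load shape described above, using only the Python standard library with SQLite standing in for the warehouse. The record shape and table name are hypothetical; a production pipeline at mod.io would target its real stores (e.g. MySQL/Redshift) and a managed orchestrator instead.

```python
# Deliberately tiny ETL sketch using only the standard library, with SQLite
# standing in for the real warehouse; record shape and table name are hypothetical.
import sqlite3
from datetime import datetime, timezone

RAW_EVENTS = [  # stand-in for an API / log extract
    {"mod_id": 101, "event": "subscribe", "ts": "2022-06-01T10:15:00Z"},
    {"mod_id": 101, "event": "download", "ts": "2022-06-01T10:16:30Z"},
    {"mod_id": 202, "event": "download", "ts": "2022-06-01T11:02:10Z"},
]

def transform(event: dict) -> tuple:
    """Normalise the timestamp and keep only the columns we load."""
    ts = datetime.fromisoformat(event["ts"].replace("Z", "+00:00")).astimezone(timezone.utc)
    return event["mod_id"], event["event"], ts.isoformat()

def load(rows: list[tuple], db_path: str = ":memory:") -> int:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS mod_events (mod_id INTEGER, event TEXT, ts TEXT)")
    conn.executemany("INSERT INTO mod_events VALUES (?, ?, ?)", rows)
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM mod_events").fetchone()[0]
    conn.close()
    return count

if __name__ == "__main__":
    print("loaded rows:", load([transform(e) for e in RAW_EVENTS]))
```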
Qualifications
- Extensive experience working with relational databases, query authoring (SQL) and designing database backup and replication approaches
- Familiarity with a variety of database technologies
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Advanced SQL knowledge
- Expert ability to tune databases and optimise query performance
- Extensive experience in our core database technologies: MySQL, Redshift (PostgreSQL) and Memcached
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and MySQL
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Experience in one of the following Scripting languages: Python, PowerShell, Bash, Shell Script.
- Experience with monitoring and logging services (e.g. Elasticsearch, Wavefront, Uptime, Solarwinds, or similar).
It would be awesome if you also have:
- Data engineering certification (e.g AWS Certified Data Engineer)
- Experience with cloud-based infrastructure and services (AWS)
- Experience working in an agile environment
- Experience with Jenkins or similar build automation tools.
- Experience with Machine learning and AI over large data sets
- Experience with Trello and G-Suite
- A passion for video games.
Benefits
- Remote working is actively supported.
- Competitive salary plus equity.
- Flexible working hours and family-friendly considerations.
- Sit-stand desks, 27” monitor, ergonomic chairs.
- Regular social events.
- Experience new games, digital and tabletop.
- Attend international gaming conferences.
- Contributing to open-source on Github

Title: Healthcare Data Entry Specialist
Location: US National – Remote Full-Time
Representing clients on a variety of projects via inbound/outbound telecommunication.
This is your opportunity to join Ashfield, represent a top biotechnology company.
What’s in it for you?
- Temporary Project with opportunity to interview with other teams internally
- Competitive compensation
- Generous performance-driven Incentive Compensation package
- Competitive environment with company wide recognition, contests and coveted awards
Key Objectives:
- Maintain excellent quality standards for all client programs; adhere to program guidelines.
- Accurately transcribe and data-enter information required by individual programs and correctly capture it in specific program databases.
- Adhere to all company policies and Standard Operating Procedures.
- Display flexibility within the department to maximize utilization.
- Exhibit highly effective transcription and data entry skills, meeting or exceeding productivity expectations.
- Safeguard patient privacy and confidentiality by following the guidelines set forth in the Privacy and Security Rules of the Health Insurance Portability and Accountability Act (HIPAA).
- Manage day-to-day activities of patient and health care provider support requests and deliverables across multiple communication channels, i.e. fax, chat, email, etc.
- Perform intake of cases and capture all relevant information in the Case Management system.
- Ensure all support requested is captured within the Case Management system and routed to the appropriate next step using decision tools and reference guides.
- Ensure timely and accurate processing of requests, including reviewing source documentation.
- Escalate complex cases when appropriate.
- Maintain excellent quality standards for all client programs; adhere to program requirements and guidelines.
- Accurately transcribe and document information received via form into client databases.
Job Holder Specification:
- High School Diploma required; Bachelor's degree or equivalent work-related experience preferred.
- Excellent verbal, written and listening communication skills.
- Knowledge of reimbursement mechanisms for medical and pharmacy benefits, patient access processes and patient assistance programs; operational policies and processes preferred.
- Proficiency in reviewing intake documents thoroughly and entering information in a database with little to no errors.
- Proficiency with Word and Excel.
- Analytical thinking, problem solving and decision making.
- Ability to multitask and manage multiple parallel projects with strong time management skills.

Ofertia is a digital company based in lively Barcelona. Our mission is to revolutionize the way people do shopping, covering the gap between the online and offline shopping worlds. We help retailers to drive more customers to their stores by publishing digital circulars on our mobile apps and web portals, where consumers can find the best offers from their favorite stores around them.
Ofertia is now part of the Mediapost Group with strong presence in Spain. Moreover, we operate in Mexico, Colombia and Sweden.
We are looking for an experienced Data Scientist (m/f) who is, like us, passionate about delivering the best product and user experience to our customers. As a key member of our cross-functional and agile team setup, you will leverage your skills and ability to extract valuable insights from various data sets. You will work with a variety of challenges that touch all parts of the business. Our goal is to design a better product that customers will love, enabling us to get meaningful data for us to make the most accurate decisions.
What would you do at Ofertia?
Identify the best state-of-the-art algorithms and libraries to solve complex problems.
Ad-hoc data analysis to answer critical business questions and to identify potential growth and development opportunities.
Enriching the company’s data with third-party information when needed.
Improving data collection procedures to include information that is relevant for building better analytic systems.
Collaborate and contribute to functional, cross-functional groups, initiatives on methodologies, innovations, technology, IT infrastructure, etc. to enable broader and more effective use of data.
Improve our internal data science tools and frameworks.
Requirements
Strong programming skills in Python
3+ years of prior experience in a Data Scientist role including statistical modelling, simulation and analysis, machine learning algorithms like predictions, recommendations, object recognition
Good knowledge and experience in image recognition, text extraction algorithms
Good knowledge of libraries like Pandas, PuLP, Matplotlib, seaborn, scikit-learn
Experience working with Linear Programming and Mixed-Integer Linear Programming (see the sketch after this list)
Experience with TensorFlow 2
Good knowledge of SQL
Comfortable with Data Engineering skills
Good knowledge of AWS Services like EC2, S3, Data Pipeline
Basic knowledge of third-party API integration to read data from external sources
Experience in executing data science projects
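As a rough illustration of the Linear Programming / MILP item referenced above, here is a minimal PuLP sketch; the decision variables, coefficients and constraints are hypothetical placeholders, not Ofertia's actual use case.

```python
from pulp import LpMaximize, LpProblem, LpStatus, LpVariable, value

# Hypothetical MILP: allocate promotional slots to two retailers to maximise
# expected store visits under a fixed budget. All numbers are illustrative.
prob = LpProblem("slot_allocation", LpMaximize)

x = LpVariable("retailer_a_slots", lowBound=0, cat="Integer")
y = LpVariable("retailer_b_slots", lowBound=0, cat="Integer")

prob += 120 * x + 95 * y, "expected_visits"   # objective
prob += 40 * x + 25 * y <= 1000, "budget"     # budget constraint
prob += x <= 15, "cap_a"
prob += y <= 20, "cap_b"

prob.solve()
print(LpStatus[prob.status], value(x), value(y), value(prob.objective))
```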
Personal skills
Excellent communication and collaboration skills, able to explain complex effects and impacts of ML/DL insights in simple business terms. Entrepreneurial mindset, self-starter, and ability to operate independently
What makes working at Ofertia great
The opportunity to bring our IT platform to the next level.
Ofertia is one of the most exciting digital companies in the heart of Barcelona with a disruptive product.
We support your personal and professional development with challenging projects and training programs.
Flexible working hours and possibility to work 100% remotely (within Spain).
A company culture driven by pioneer-thinking and talent that crosses departments through flat hierarchies and short communication channels.


Who we are, what we do & why we do it
We are Dext. Our suite of tools makes accountants more productive, profitable and powerful. In doing so we give them back the most precious commodity, time, which they can then use to add greater value to their clients.
Accountants and bookkeepers are the backbone of every successful business. For more than a decade we’ve empowered our partners with innovative technology solutions to make businesses better. Dext allows them to meet the challenges they face today, tomorrow and in the future.
We are now seeking an experienced Product Manager to help with our continuous efforts to improve the level of automation we bring to our extraction processes.
The role (what you’ll do):
- Manage the data extraction lifecycle; you will be responsible for determining what 'success' looks like, taking into consideration a number of macro and micro factors
- Research new ways to improve our automation processes, working closely with our machine learning lead and lead data scientist on requirements, data capture and sampling, ultimately using the tools at your disposal to spot actionable insights
- Channel the voice of the customer to ensure data outputs are in line with user goals
- Evaluate and implement new product ideas
- Work closely with the wider product team and stakeholders to align on product releases and team roadmaps
- Ensure that stakeholder needs are considered and evaluated during the development stages
About you (what we are looking for):
- Equally comfortable using reporting tools such as Looker, Snowflake, SQL as well as building simple data science models in a notebook.
- Successful candidates in the past have come from a data science background or product analysts who have applied data science techniques.
- Ideally, previous Product Management experience in a fast-paced technology business. Some experience of advising on or making product-led decisions is needed.
- Ideally, previous experience in a SaaS, rapid growth environment
- Capable of making data driven decisions
- Stakeholder management
- Managing and prioritising a backlog
These are our ideal requirements, but we hire on potential, not just on experience, and we know that some people are less likely to apply for a role if they don’t meet 100% of the criteria. At Dext we are committed to cultivating a diverse, inclusive and empowering culture, so please apply if you meet the majority of these competencies.
You can read more about our Diversity & Inclusion commitments here.
What you will be a part of:
We are a highly ambitious, innovative, market-leading FinTech. We are a global, well-funded business but have the dexterity and pace of a scale-up. We are uncompromising in our desire to achieve our best, day-in day-out, and we have three clear values which guide everything we do:
Be Brave - Everyone in the company has a voice to challenge ideas and the status quo.
Be Exceptional - We set high standards for ourselves. We aim to be exceptional at what we do.
Be Together - We are one team. There is no such thing as individual success without team success.


DirectID is growing fast and thoughtfully. We are a remote-first organisation and have been recognised as one of Scotland’s top 10 most flexible employers. We know we're not just building world-class products but also a world-class team and a set of sustainable development practices that will continue to deliver value as we scale.
We believe that the best work comes from cross-functional, self-organising, and erse teams that take ownership of their processes and practices. You will be trusted to make things happen.
Our people set us apart and these are some of the words they use to describe our culture: Flexible, Fast, Compassionate, Potential, Professional, Fun.
Our values have grown organically through our behaviours and outcomes, and they drive everything we do: We’re a Clan, With one Shared Vision, For our Customers, and Brave at Heart.
YOUR ROLE
As a data analyst, you will work within our Data Insights team, close to our data scientists and product team. Your work will help us improve our analytical solutions and will have a direct positive impact on our customers' experience.
We are looking for a Data Analyst who will be instrumental in leading improvements to our machine learning models driven by annotation. You'll help build and improve our machine learning models and the processes related to their performance. Dealing with huge datasets, you will have ownership over the annotation and visualization tasks assigned to you, which will help shape our existing and future products.
Requirements
YOU NEED TO HAVE
Some prior experience in linguistic annotation (text data annotation)
Demonstrable Power BI experience and data visualization skills
A passion for data, efficiency, and accuracy
A curious nature, not being biased by previous knowledge
Experience delivering in data science/engineering environments.
Demonstrable experience with and understanding of Power BI
Some experience using a variety of data mining/data analysis methods.
Some experience with C#, Azure ML, SQL
Proactive management of individual tasks alongside open collaboration with the wider team.
Benefits
THE PERKS (benefits may vary depending on location):
- A team of passionate, interesting people committed to your success.
- Challenging problems to solve.
- We're a growing company; your contributions will be valued.
- £35k gross/pension/EMI share scheme.
- Continuous Professional Development budget (CPD).
- Uncapped bike to work scheme.
- Monthly recharge time.
- Clan events and workshops.
- Generous holiday allowance (we will insist you take it!).
- Home working contribution for your setup (get comfy, we want you to stay)
- Ask us about Flexible Working
REDEFINE AN INDUSTRY
We are on a mission to enable our customers across the globe to effortlessly make use of bank data to better understand their customers, grow their business, revolutionise their offerings and delight with customer service.
At DirectID you will be working for a business that is redefining the industry for lenders and their consumers through our global credit & risk platform. This is an exciting stage in our growth and we’d love you to be part of the story.


MEMX is looking for a passionate, self-motivated and hard-working Data Engineer to join a fast-paced and highly technical team. You will collaborate with many different functions that are end users of the data platform. You will contribute to innovative analytics and data utilization projects that will continue to differentiate MEMX from the competition and support regulatory compliance.
< class="h3">Responsibilities:
- Developing and maintaining queries.
- Developing and maintaining operational dashboards.
- General Administration of Data Platform software and servers.
- Generation and submission of reports.
- Resolving data platform issues.
- Scripting and building automation tools
- Partner with the Operations, Dev, Member Experience and other teams to understand data requirements
- Design, build and launch extremely efficient and reliable data pipelines to move data across a number of platforms to meet daily and monthly reporting requirements.
Requirements
- 3+ years of industry experience in software development, data engineering, business intelligence, data science, or related field with a track record of manipulating, processing, and extracting value from large datasets.
- Programming experience in building high quality software.
- Java experience preferred
- Linux Scripting Experience
- SQL Experience
- Experience with designing and implementing real-time pipelines
- Experience with data quality and validation
- Experience designing data schemas
- 1+ year(s) of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- Demonstrated ability to work well independently and within a fast-paced, collaborative environment
- Demonstrated communication and interpersonal skills to work across diverse stakeholders and cross-functional teams
Benefits
Members Exchange (MEMX) is a growing FinTech firm founded by a group of leaders in the world’s financial markets and is currently the fastest growing U.S. equities exchange. Our people are the foundation of our business, and we are committed to maintaining the culture we have set in motion. We take great pride in our selection process — and that starts with finding the right people. At MEMX you will have the ability to work with a talented team of professionals who bring diversity of thought and background. You will have the opportunity to shape the future of our company and the impact MEMX will have on our clients and the broader markets. We offer competitive employee benefits and perks and will continue to make this a priority to attract the best.
· Fully Remote Workforce*
· Health Care Plan (Medical, Dental & Vision)
· Company Sponsored Retirement Plan
· Unlimited PTO
· Paid Family Leave
· Short-Term & Long-Term Disability
· Training & Development
· Wellness Resources
*Current list of approved remote work states:
- Connecticut
- Florida
- Illinois
- Kansas
- Maine
- New Jersey
- New York
- North Carolina
- Pennsylvania
- South Carolina

< class="h1">Description
< class="h4" id="Responsibilities" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Responsibilities
< class="h4" id="Qualifications" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Qualifications
This is a remote position.
< class="h4" id="Overview" style="margin: 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">OverviewAnant is a destination employer for high-performing, erse, global talent. Our Data Engineers support the development, operation, and maintenance of real-time data processing. They oversee and deliver the success of client and internal projects. The Data Engineer will not only support our internal team, but will also participate in client project work including design of novel systems, debugging performance degradations and read/write latencies, audits, monitoring, and health checks. An ideal Data Engineering candidate will have experience supporting rollout of migration tooling through client environments by troubleshooting GKE, Airflow, Dataproc, DataStax Enterprise, and DataStax Astra. Other candidates will gain experience using these tools.
We are looking for a Data Engineer to join our team immediately. We look for the best and brightest and those willing to learn.
Soft Skills
- Demonstrate a passion for excellence in work product and customer delivery
- Create and deliver live and recorded demos for customers and internal stakeholders
- Familiarity with the enterprise data platform ecosystem
- Continuous learning mindset
Hard Skills
- Troubleshoot and support rollout of tooling and services that use Airflow (on K8s), Spark (managed), DataStax Enterprise, and DataStax Astra
- Create, troubleshoot, and refactor Python DAGs (see the sketch after this list)
- Create and deploy infrastructure as code via Ansible and Terraform
- Demonstrate familiarity with creating and destroying resources on GCP, including GCP monitoring dashboards
- Demonstrate an aptitude for RCA and troubleshooting code and systems integration issues
- Familiarity with Scala, Python, and Java
- Conduct rapid POC development and be able to transfer knowledge to others
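As a rough illustration of the "Python DAGs" item above, here is a minimal Airflow 2 DAG sketch; the DAG id, schedule, and task logic are hypothetical placeholders, not Anant's actual pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step: return a few rows for the downstream task.
    return ["row-1", "row-2"]


def load(ti):
    # Pull the upstream result from XCom and "load" it (placeholder logic).
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="example_pipeline",        # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```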
Most Wanted Qualifications
- Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure
- 2+ years of relevant software design and development including the below as well as source control apps such as Bitbucket, Github, etc.
- 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm
- 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE)
- 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform
< class="h4" id="Responsibilities" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Responsibilities
- Work with multiple teams and multiple projects (e.g., application, infrastructure, cloud, etc.) to:
  - Complete requests (adding new or decommissioning existing clusters)
  - Debug and resolve issues
- Utilize project management software (e.g., Jira) to log time and resolve tickets
- Create and update SOPs, Runbooks, issue reports, and other documentation as required
- Consult on client projects; maintain client confidentiality and protect client operations by keeping information confidential
- Contribute to team effort by using effective communication skills, being a self-starter, and taking responsibility for deliverables
< class="h4" id="Qualifications" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Qualifications
- BS degree in Computer Science or related technical field involving coding, or equivalent practical experience
- Ability to troubleshoot, debug and optimize code
- Ability to identify and automate routine tasks
- Incident management and root cause analysis
- Strong organizational, time management, and detail skills
- Strong communication and interpersonal skills, able to comfortably and pleasantly deal with a variety of people
Working at Anant
- Anant performs business around the clock, but some availability during US Eastern Time business hours is important.
- Anant is a 100% remote workplace.
- Anant is currently looking to hire part-time, with future full-time work available.
< class="h3">Benefits < class="h2" id="About-Anant" style="margin: 1.8em 0px 0px; padding: 0px; font-style: normal; line-height: 1.2; color: #172b4d; font-weight: 500; letter-spacing: -0.008em; text-transform: none; border-bottom-color: #cccccc; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">About Anant
Anant is working to become the authoritative market leader in business platforms. Most technology leaders have a hard time retaining the experts to help them build and manage global data platforms because of the high costs of specialized talent. We created a training program for client teams and a network of trained specialists on our framework who are available on a full-time, part-time, or project-by-project basis.
Anant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.
< class="h1">Skills
Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure 2+ years of relevant software design and development including the below as well as source control apps such as Bitbucket, Github, etc. 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE) 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform
< class="h1">Experience4-5 years

< class="h3">Company Description

NBCUniversal owns and operates over 20 different businesses across 30 countries including a valuable portfolio of news and entertainment television networks, a premier motion picture company, significant television production operations, a leading television stations group, world-renowned theme parks and a premium ad-supported streaming service.
Here you can be your authentic self. As a company uniquely positioned to educate, entertain and empower through our platforms, Comcast NBCUniversal stands for including everyone. We strive to foster a diverse and inclusive culture where our employees feel supported, embraced and heard. We believe that our workforce should represent the communities we live in, so that together, we can continue to create and deliver content that reflects the current and ever-changing face of the world. Click here to learn more about Comcast NBCUniversal’s commitment and how we are making an impact.
Job Description
Welcome to Peacock, the dynamic new streaming service from NBCUniversal. Here you’ll find more than a job. You’ll find a fast-paced, high-flying team for unique birds that want to be at the epicenter of technology, sports, news, tv, movies and more. Our flock works hard to connect people to what they love, each other and the world around them by creating shared experiences through culture-defining entertainment.
As a company, we embrace the power of difference. Our team is committed to creating an organization that champions diversity and inclusivity for all by curating content and a workforce that represents the world around us. We continue to challenge ourselves and the industry by being customer-centric, data-driven creatures of innovation. At Peacock, we are determined to forge the next frontier of streaming through creativity, teamwork, and talent. Here you can fly to new heights!
As a member of the Peacock Video Quality & CDN Team, you will perform complex data analysis, design and create queries across multiple data sets and make them available for consumption through dashboards and/or ad hoc reports.
Working alongside Data Engineers and CDN Architects, you will deliver key insights and analytics that will help make better business decisions, improve KPIs and deep dive on issues particular to video and ad delivery systems.
Responsibilities
- Provide daily trend analysis on Core video KPIs
- Serve as the regional Video Metrics SME, interfacing with other technology domains and providing expertise to support, development and operations teams
- Work with front end RUM and backend KPI data
- Work with ML-based alerting and anomaly detection systems
- Play an integral role in the contribution towards the international supplier choice and commercial management strategy
Qualifications
- Bachelor’s degree in Computer Science, Information Technology or a relevant field
- Minimum five (5) years of experience working with video or similar data sets
- Solid understanding of telecommunication & Internet technologies
- Solid understanding of the IP video chain (e.g. encoding, packaging, origin layer, network integration)
- Experience in video performance management
- Ability to oversee communications concerning CDNs including presentations to executives and building detailed design documentation
- Experience with creating data visualizations to communicate metric performance.
- Experience with statistical analysis or data modeling techniques.
- Experience building data infrastructure and implementing reporting solutions.
- Experience with Google and Amazon Cloud tools is a plus
- Ability to manage multiple projects simultaneously and deal with ambiguity
- Excellent verbal and written communication skills
- Background in utilizing SQL/Python to extract, manipulate, and analyze datasets.
Desired Characteristics:
- Strong analytical skills with a ‘join-the-dots’ approach
- Knowledge of data analytics, modelling and statistical analysis
- Experience working as part of a geographically distributed team
- Proactive, independent, and able to articulate to a technical and non-technical audience.
- Should be a problem solver with an open mind and an eagerness to pick up new skills
- Telecommunications industry experience (or similar e.g. academia). BSc / MSc / PhD in Computer Science, Electrical Engineering or similar
NBCUniversal's policy is to provide equal employment opportunities to all applicants and employees without regard to race, color, religion, creed, gender, gender identity or expression, age, national origin or ancestry, citizenship, disability, sexual orientation, marital status, pregnancy, veteran status, membership in the uniformed services, genetic information, or any other basis protected by applicable law. NBCUniversal will consider for employment qualified applicants with criminal histories in a manner consistent with relevant legal requirements, including the City of Los Angeles Fair Chance Initiative For Hiring Ordinance, where applicable.
If you are a qualified inidual with a disability or a disabled veteran, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access nbcunicareers.com as a result of your disability. You can request reasonable accommodations in the US by calling 1-818-777-4107 and in the UK by calling +44 2036185726.

Who we are:
We're an ambitious team of researchers, builders, and entrepreneurs with many years of experience in the crypto space and are passionate about digital assets. With a new product roadmap, we are adding several new products to our portfolio, beyond on-chain data - we are becoming a more comprehensive go-to crypto platform for various needs.
Glassnode is a leading market intelligence platform for on-chain and derivatives data. We offer the most comprehensive library of advanced on-chain metrics and insights for Bitcoin, Ethereum and hundreds of other digital assets, bringing data intelligence to the blockchain and cryptocurrency space.
With an emphasis on user-centric interfaces, we aim to deliver intelligible, actionable, and unique insights from blockchain data and beyond, helping investors make educated decisions in an overly emotional and irrational market, and equipping blockchain teams with valuable insights on their projects and tokens.
Senior Data Engineer (m/f/d). Remote. An employment contract or consultancy is possible.
About the role:
Extend the crypto data engineering frontier by embarking on a journey of shaping our innovative, industry-leading products (powered by accurate and reliable data).
We are searching for a driven and experienced Data Engineer ready to contribute to taking our products to the next level. You will collaborate closely with our data science and engineering teams and work on our crypto data platform, efficiently pulling data from a range of sources and creating and maintaining data pipelines that ultimately shape and enrich our product offering.
Join us remotely, you can be located anywhere around the CET time zone, as our work is 100% online. The position requires a full-time engagement; both a Contract and an Employment Agreement are possible.
< class="h3">What you will do:

- Extract, combine, transform and store data from blockchains, exchanges and other sources utilizing modern data processing frameworks (see the sketch after this list)
- Maintain, improve and extend our data pipelines
- Independently identify new ways to arrange and enrich data to enable novel research and product directions
- Work closely together with our data scientists to get the most out of our on-chain and crypto-financial data
- Introduce the necessary tools to monitor and QA our data pipeline infrastructure
- Collaborate with our team of backend engineers and accompany code from ideation to implementation
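As a loose illustration of the extract-transform-store work referenced in the first bullet above, here is a minimal pandas sketch that resamples raw exchange trades into hourly OHLCV bars and writes them to Postgres; the file name, schema, table, and connection string are hypothetical, not Glassnode's actual pipeline.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical input: raw trade ticks with timestamp, price, and size columns.
trades = pd.read_csv("btc_trades.csv", parse_dates=["timestamp"])
trades = trades.set_index("timestamp").sort_index()

# Transform: resample ticks into hourly OHLCV bars.
ohlcv = trades["price"].resample("1H").ohlc()
ohlcv["volume"] = trades["size"].resample("1H").sum()

# Load: persist to Postgres (placeholder connection string and table name).
engine = create_engine("postgresql://user:pass@localhost:5432/metrics")
ohlcv.to_sql("hourly_ohlcv", engine, if_exists="append")
```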
What we're looking for:
- 4+ years working as a data engineer or in a relevant field
- Extensive experience working with high-performance ETL data pipelines using state-of-the-art methodologies and tooling
- Advanced knowledge of large-scale data analysis with a great understanding of scalability and robustness
- High Proficiency in Python and its open-source data science ecosystem (pandas, numpy, matplotlib, ...)
- Very good knowledge of SQL (Postgres) and BigQuery
- Solid understanding and experience of microservices (e.g. using Docker and Kubernetes) as well as cloud infrastructure (preferably GCP)
- Passionate about Bitcoin and the crypto industry
- Working experience with financial (other time-series) data
- Experience with Graph analysis and graph databases
- Experience with Argo workflows
- Experience with golang and/or (py)spark
What we offer:
- A modern technical stack with an emphasis on quality
- Flexibility to organise your work and hours the way you like, our remote-first setup enables this for everyone
- Join a young, self-funded and already profitable company in a future-proof market
- Freedom to own your decisions and experiment, we need driven experts who help us figure out what to do, not for us to point at what needs to be done
- Be one of the main contributors to building a company with its unique culture in the cutting-edge tech space
- No approval loops or unnecessary processes, quick decision-making and full ownership of your function
- We are working on a kick-ass, meaningful benefits package, something truly useful and empowering - you will help us figure out what we'd need to stand out as the workplace of the future, we are open to any ideas
- Frequent company offsites, we love remote but love to have fun together too - this year we went to Lisbon for a week, we're also coming to Switzerland for our Xmas party and more adventures are coming

< class="h1">Description

This is a remote position.
The mission of the Processing Team is to build cross-domain systems to perform RF-based data collection and geolocation. The Processing team includes experts across FPGA development, embedded software, software defined radio, and cloud development; plus deep knowledge of signal-of-interest (SOI) digital signal processing, RF communications systems, RF measurement systems, and geolocation. Our client is currently seeking a Senior Data Engineer who can help the Processing team design, build, and deploy data pipelines for RF processing and geolocation.
As a senior data engineer on the Processing team, you will be responsible for designing and implementing distributed, reliable backend software systems to consume and leverage RF data at scale. You will need experience building and running data pipelines in production, with a passion for robustness, observability, and monitoring. A successful data engineer will be expected to work closely with RF & Geolocation domain specialists, data scientists, and analysts to deploy pipelines while optimizing for both performance and low latency.
We support a broad range of software to accomplish our mission, especially favoring Python and C++ for backend software; Kubernetes clusters on AWS; data pipelines orchestrated with Airflow; data storage with Amazon S3 and PostgreSQL as appropriate; and Elasticsearch and Kibana for log analytics and monitoring dashboards (a brief sketch of a typical batch step follows the responsibilities list below).
Location: This position can be hybrid with work-from-home flexibility or 100% remote.
Your main responsibilities will be:
- Contribute to the design, implementation, and testing of the company's data platform and data pipelines, optimizing for scalable, low-latency deployment within a batch-processing cloud environment
- Build, document, and support software systems & tools (data pipelines, utility libraries, core features, etc) enabling high-quality research and production deployments throughout the team
- Define scope, build consensus within the technical team, and drive new feature development with input from stakeholders throughout the company
- Participate in collaborative & fast-paced software development practices, particularly performing merge request reviews, providing design feedback, etc
- Guide and mentor other individual contributors; work closely with RF & Geolocation domain specialists to achieve the team mission
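The stack description above mentions batch pipelines that move data between Amazon S3 and PostgreSQL; the following is a minimal sketch of one such step, with the bucket, key, table, and column names as hypothetical placeholders rather than the client's actual schema.

```python
import io

import boto3
import pandas as pd
import psycopg2


def load_geolocations(bucket: str, key: str, dsn: str) -> int:
    """Fetch one processed geolocation file from S3 and load it into PostgreSQL."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))

    # Placeholder columns; a real pipeline would validate the schema first.
    rows = list(df[["signal_id", "lat", "lon", "collected_at"]].itertuples(index=False))
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO geolocations (signal_id, lat, lon, collected_at) "
            "VALUES (%s, %s, %s, %s) ON CONFLICT (signal_id) DO NOTHING",
            rows,
        )
    return len(rows)
```

A step like this would typically be wrapped in an Airflow task so retries, scheduling, and monitoring come from the orchestrator rather than the function itself.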
< class="h3">Requirements Education and experience:
- B.S. degree in Computer Science or comparable or equivalent experience
- 6+ years of professional experience
- 3+ years of experience building data pipelines and other cloud-based infrastructure: workflow management (e.g. Airflow, Argo workflows, AWS step functions), object storage, relational databases (specifically PostgreSQL, PostGIS, and experience writing/testing SQL), REST/GraphQL APIs, message passing (Kafka, SNS), etc
- Experience with data science and/or software development using Python, especially using industry-standard Python libraries: pandas, scipy, scikit, dask/ray, flask, fastAPI, etc
- Experience building software and tools facilitating effective research & development – a passion for writing clean code, scalable architectures, test-driven development, and robust logging
Essential:
- Familiarity with CI/CD best practices: automated testing, using a dev/prod workflow, deploying to Artifactory or other package manager, deploying containerized software, etc.
- Track record of building and supporting mission-critical backend applications in production
Desirable:
- Experience administrating modern cloud applications and infrastructure running in Kubernetes on AWS or other cloud provider
- Working knowledge of frontend development (react/angular, javascript, web-assembly, etc), especially prior examples building proof-of-concept applications to consume & interact with data products
- Familiarity with the ELK stack (elasticsearch, logstash, kibana) for aggregating logs, creating queries/dashboards, and monitoring production deployments in real time
- Familiarity with software acceleration including multi-core parallelism, cluster-based scaling (e.g. Dask, Spark, etc), and/or GPUs, for bespoke applications
- Familiarity with RF signal processing or geolocation algorithms and applications, particularly in a batch-processed cloud environment
< class="h1">Skills
Education and experience: B.S. degree in Computer Science or comparable or equivalent experience 6+ years of professional experience 3+ years of experience building data pipelines and other cloud-based infrastructure: workflow management (e.g. Airflow, Argo workflows, AWS step functions), object storage, relational databases (specifically PostgreSQL, PostGIS, and experience writing/testing SQL), REST/GraphQL APIs, message passing (Kafka, SNS), etc Experience with data science and/or software development using python, especially using industry-standard standard python libraries: pandas, scipy, scikit, dask/ray, flask, fastAPI, etc Experience building software and tools facilitating effective research & development – a passion for writing clean code, scalable architectures, test-driven development, and robust logging Essential: Familiarity with CI/CD best practices: automated testing, using a dev/prod workflow, deploying to Artifactory or other package manager, deploying containerized software, etc. Track record of building and supporting mission-critical backend applications in production Desirable: Experience administrating modern cloud applications and infrastructure running in Kubernetes on AWS or other cloud provider Working knowledge of frontend development (react/angular, javascript, web-assembly, etc), especially prior examples building proof-of-concept applications to consume & interact with data products Familiarity with the ELK stack (elasticsearch, logstash, kibana) for aggregating logs, creating queries/dashboards, and monitoring production deployments in real time Familiarity with software acceleration including multi-core parallelism, cluster-based scaling (e.g. Dask, Spark, etc), and/or GPUs, for bespoke applications Familiarity with RF signal processing or geolocation algorithms and applications, particularly in a batch-processed cloud environment Company Overview: Our client is delivering a revolutionary source of global knowledge based on radio frequency (RF) geo-spatial analytics to those working to make the world a safer place. The company operates a commercial satellite constellation that detects, geo-locates, and identifies a broad range of signals & behaviors. We employ cutting edge AI techniques to equip our global customers with high-impact insights needed to make decisions with confidence. Headquartered in Herndon, Virginia. The client is committed to hiring and retaining a erse workforce. They are proud to be an Equal Opportunity Employer, making decisions without regard to race, color, religion, sex, sexual orientation, gender identity, gender expression, marital status, national origin, age, veteran status, disability, or any other protected class.
< class="h1">EducationBachelor's degree

At Abusix, we are remote working pioneers and thought leaders fighting network abuse and other cyber threats on a global scale. We build an ecosystem that enables everybody on the internet to transform security-related data into actionable information, transport it to the right place, and help mitigate it as fast as possible. Together with our amazing customers, some of the biggest Internet Service Providers, Datacenter Operators, and Enterprises, we do make the internet a safer place.
OUR VISION is to build a global platform for all internet stakeholders where data can be shared and made actionable to contribute to a safer online environment.
OUR MISSION is to fight network abuse on a global scale.
About The Role
Our engineering team is looking for a Senior Engineer Data Processing (gn) with a high degree of self-initiative, excellent communication skills and a strong eagerness to take on responsibility. You will join our Data Services team, which develops the software for processing all data required to run our self-developed software solutions in a remote team environment. The main goals are setting the technical direction for the team and the processing and adaptation of high-volume parallel data streams with modern and scalable technologies like Apache Kafka, Apache NiFi and NSQ. In this position you are responsible for maintenance and enhancement of our existing systems, architecture planning and testing. By taking on responsibility and contributing your own ideas, you will bring our products to the next level.
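As a rough sketch of the kind of high-volume stream processing described above, the snippet below consumes records from one Kafka topic, normalises them, and republishes them to another using the kafka-python client; the topic names, broker address, and record schema are illustrative assumptions, not Abusix's actual setup.

```python
import json

from kafka import KafkaConsumer, KafkaProducer

# Hypothetical topics and broker; real deployments would read these from config.
consumer = KafkaConsumer(
    "abuse-reports-raw",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    group_id="enricher",
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for message in consumer:
    report = message.value
    if not report.get("source_ip"):
        continue  # drop malformed records
    report["normalised"] = True
    producer.send("abuse-reports-enriched", report)
```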
< class="h3">What You Will Do

- Maintain and extend existing and new applications to provide data and insights for different teams and products, including customers.
- Discuss, plan, segment incoming requirements from customers and other teams to realize new features and achieve the best possible outcome.
- Research new and alternative technologies to solve problems efficiently.
- Guide and support students and juniors in their work.
- Test and review your own code and that of coworkers to minimize maintenance and increase code quality.
- Maintain documentation to allow other company members to understand your projects and to make troubleshooting easier.
- Contribute to a continuous improvement of the development process in the team.
- Provide assistance to other teams when interacting with data supplied by the data team to achieve useful outputs.
What You Bring
- Your main values match ours: ownership, collaboration, innovation, respect and authenticity.
- Min. 5 years experience in software engineering and data processing.
- Proactive personality who seeks information and is thinking outside the box.
- You write correct, elegant, flexible and performant code to keep your own bug fixing efforts low.
- Eager to take on responsibility, intrinsic motivation and reliability.
- Experience working with high level programming languages and test frameworks.
- Deep understanding of parallel and distributed systems.
- Familiar with challenges of high-volume stream and data processing, incl. message broker concepts (Apache Kafka).
- Deep knowledge of software engineering principles and agile development processes.
- High level of experience working with databases (SQL, NoSQL, Cache).
- Familiar with AWS, Kubernetes, Docker.
- Fluent in English, German is a plus.
- Valid work permit for country of employment.
What We Offer
- Work with the latest technology, in the cloud and with the latest frameworks.
- Share defined projects open source.
- An extremely steep learning curve.
- Impact, responsibility and participation in a very interesting field.
- Competitive pay according to your country and meaningful equity.
- Home office equipment or coworking space membership contribution.
- Flexible working hours.
- Fast-paced high-tech B2B company.
- Conference and travel budget.
- Fun team events.
- Additional benefits depending on your place of living.
Our Values
- Ownership - Everybody takes the seat as the Co-CEO of Abusix to drive results proactively.
- Collaboration - We genuinely believe that we are stronger together as a team.
- Innovation - We love to challenge the status quo and want our people to be brave and embrace failure in a safe environment.
- Respect - Our daily interactions are guided through respect for each other, our customers, and the environment.
- Authenticity - We are true to our personalities and values, have fun at work and don't take ourselves too seriously.

< class="h3">Company Description

Are you:
- A multi-faceted all-star who loves streamlining the accurate flow of data from one system to another through a combination of low-code tools and custom scripting?
- Someone who loves data visualization and has a knack for presenting data elegantly using modern Business Intelligence tools?
- Someone who loves details, and leaves no stone unturned when trying to solve a problem or check your work?
- Someone who has a unique combination of accounting, data science and engineering skills, but who still operates with warmth, compassion and humility in everything they do?
If so, you might be the right person to work directly with the CEO to tackle some of the firm’s most challenging problems – how to efficiently access, clean, and move large amounts of data between software systems both internally and for the clients.
< class="h3">Job DescriptionSupport Existing Customer Automations
- You will quickly get up to speed on the data pipeline our client has built via Python, JavaScript, custom macros in Excel and Google Sheets and low-code tools
- You will manage tools that access customer data in SaaS apps via API or sometimes through manual report downloads when APIs are not available
- Your accounting knowledge will help you understand the data manipulation they must do before pushing customer data into QuickBooks Online via its API. You can then check your work with your understanding of Journal Entries and Debits and Credits (see the sketch after this list)
- In learning and supporting the existing automations they’ve built, you’ll be immediately looking for ways to improve the overall systems – using new tools or improved safe-guards to best serve customers’ needs
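As a small illustration of the "check your work with Journal Entries" idea referenced above, here is a hedged pandas sketch that verifies each entry's debits equal its credits before any push to an accounting system; the column names and sample rows are hypothetical and this is not the firm's actual pipeline.

```python
import pandas as pd

# Hypothetical journal-entry lines; a real run would build this frame from
# cleaned customer data rather than hard-coded rows.
lines = pd.DataFrame(
    [
        {"entry_id": "JE-100", "account": "Accounts Receivable", "debit": 1200.0, "credit": 0.0},
        {"entry_id": "JE-100", "account": "Revenue", "debit": 0.0, "credit": 1200.0},
    ]
)

# Every entry must balance: total debits == total credits (to the cent).
totals = lines.groupby("entry_id")[["debit", "credit"]].sum()
unbalanced = totals[(totals["debit"] - totals["credit"]).abs() > 0.005]

if not unbalanced.empty:
    raise ValueError(f"Unbalanced journal entries: {list(unbalanced.index)}")
print("All journal entries balance.")
```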
Support Our Client’s Existing Automations
- Similarly, you’ll support existing critical process automation they’ve built internally with Python, Google Sheets, APIs and data warehousing in Google Big Query. This work supports critical parts of monthly operations including billing, profit-sharing, etc.
- Working directly with the CEO, you’ll streamline both the creation and presentation of various internal reports we use on a weekly basis
Tackle New Automation Opportunities
- You’ll work directly with the team to seek out opportunities to further automate data services for clients. This will include seeking out automation opportunities outside of finance and accounting workflows. This may include automating parts of their marketing processes, field scheduling, inventory purchasing, etc. Over time, you’ll build out a team to scale the process automation services for clients
- Internally, you’ll work directly with the CEO to help implement and integrate a new CRM into their proposal and workflow management tools via thoughtful use of tools like Zapier. You’ll tackle other interesting challenges internally, including building systems for more real-time margin presentation and budget-to-actual analysis
What We're Looking For:
- Fluency in Python and JavaScript and a strong understanding of low-code tools such as Zapier and Integromat. What you don’t know, you can learn quickly
- Strong familiarity with accounting, including an understanding of how a Balance Sheet and P&L work together. You are familiar with Journal Entries, Debits and Credits, and QuickBooks Online
- Access QuickBooks data, and connect it in relevant ways that make businesses easier to run and more profitable.
- Obsession with documentation and adherence to process. You love creating and following structured workflows, and get excited by helping others adhere to consistent processes
- Humility, patience, and a deeply rooted servant leadership mentality. You love to “get in the weeds” and avoid regular micromanagement
- Excellent communication skills – with both clients and team members across multiple communication channels (email, MS Teams, Zoom, etc.)
- Ability to quickly build expertise in cloud accounting technologies (QBO, Bill.com, Gusto, Divvy/Ramp, Expensify/Tallie, Salesforce, Google Drive, etc.)
- Resourcefulness – an expert problem solver who is not afraid to ask for help when needed
- A keen attention to detail – with a sense of integrity, dependability, and joyfulness
What’s In It for You:
- 100% remote work – but you’ll also get to travel to conferences and to a once-a-year company gathering. Other key strategy sessions throughout the year will also be held in-person
- Generous compensation – including health and dental insurance, retirement matching, PTO, maternity/paternity leave, annual-wellness benefits and much more
- Joyful culture – team members love what they do, and that infectious energy permeates everywhere throughout the company.
- Endless growth opportunities – this is a fast-growing company on a decades long journey of evolution that loves team members who have a similar growth mindset and horizon
About The Firm:
We are a fast-growing, cloud finance and accounting services firm founded on two equal missions:
- Providing team members with their Dream Job. This means building a workplace where team members experience joy daily, are cared for and respected, afforded flexibility, given room to grow professionally and are compensated at the top of the industry
- De-stressing and freeing up clients via excellent and modern cloud-based finance and accounting services. We handle accounting, payroll, bill-pay, invoicing, reporting and other needs for customers so they can focus on their business, organization, family or other priorities.
If you feel you have the necessary qualifications to perform this job, please forward a current copy of your resume and state your salary requirements.

< class="h3">What do we do?

Paddle offers SaaS companies a completely different approach to their payments infrastructure. Instead of assembling and maintaining a complex stack of payments-related apps and services, we’re a Merchant of Record for our customers, taking away 100% of the pain of payments fragmentation. It’s faster, safer, cheaper, and, above all, way better.
In May 2022, we joined forces with ProfitWell. ProfitWell provides BI solutions that improve retention and monetization automatically through unmatched subscription intelligence. As one team and one platform, we offer the "done for you" approach to SaaS payments, billing, and growth.
We’re backed by investors including KKR, FTV Capital, Kindred, Notion, and 83North and serve over 3000 software sellers in 245 territories globally.
< class="h3">The Role:
The Data & Insights team are building the foundations to support Paddle's growth and demand for commercial data. This is an exciting time to join, where you will be part of a team migrating to a scalable BI stack and driving proactive business insights to impact our Go-to-Market strategy.
The Data & Insights team are responsible for all aspects of data modelling in the commercial functions, including but not limited to: business forecasting, revenue planning, demand generation pipeline, and ABM reporting.
The role of the Data & Insights Analyst is to support commercial functions in understanding their data, both on an ad hoc and project basis. This includes capturing and understanding business requirements, designing and deploying appropriate data models and presenting insights derived from this data to enable operational performance.
< class="h3">What you'll do:
- Contribute to the development of Data and Insights projects end-to-end:
- Work on scoped briefs under the guidance of the Data and Insights Manager and Senior Analysts.
- Work with CRM, Marketing data, product data, .csv files and several other data sources through usage of Excel and SQL.
- Create best-in-class dashboards and visualisations to drive maximum insight from Paddle’s data.
- Collaborate with other members of the Data & Insights, and broader Revenue Operations team.
- Become a subject matter expert on Paddle commercial data.
- Collaborate on the development of self-serve initiatives across the business to support both technical and non-technical teams with data discovery and drawing actionable insights.
- Contribute to strategic planning by deriving insights and recommendations from projects and reporting, to help shape the direction of the commercial business. Including but not limited to:
- Business Intelligence dashboarding.
- Data discovery.
- Development of industry knowledge and best practices.
- Customer revenue modelling
What we're looking for:
- Strong commercial experience in SQL and Excel. Experience with Python is a plus.
- Experience driving insights from complex datasets.
- Strong data visualisation and presentation techniques.
- Exposure to BI and ETL tools, such as Sisense, Looker, Tableau and PowerBI. Working knowledge of dbt and Snowflake is a plus.
- A proactive approach to challenges; comfortable operating in an unstructured environment
- Proven project and stakeholder management skills
< class="h3">Everyone is welcome at Paddle
At Paddle, we’re committed to removing invisible barriers, both for our customers and within our own teams. We recognise and celebrate that every Paddler is unique and we welcome every individual perspective. As an inclusive employer we don’t care if, or where, you studied, what you look like or where you’re from. We’re more interested in your passion for learning and what you’ll bring to the table. We encourage you to apply even if you don’t match every part of the job ad, especially if you’re part of an underrepresented group. Please let us know if there’s anything we can do to better support you through the application process and in the workplace. We’re committed to building a diverse team where everyone feels safe to be their authentic self. Let’s grow together.
< class="h3">Why you’ll love working at PaddleWe are a erse, growing group of Paddlers across the globe who pride ourselves on our transparent, collaborative and respectful culture. We live and breathe our values, which are:
Exceptional Together
Execute with impact
Better than Yesterday
We offer a full suite of benefits, including attractive salaries, stock options, retirement plans, private healthcare and wellbeing initiatives.
We are a ‘digital-first’ company, which means you can work remotely, from one of our stylish hubs, or even a bit of both! We offer all team members unlimited holidays and 4 months paid family leave regardless of gender. We love our casual dress code, annual company retreats and much more. We invest in learning and will help you with your personal development via constant exposure to new challenges, an annual learning fund, and regular internal and external training.
< class="h6">#LI-REMOTE

As the Data SRE Manager you'll be responsible for managing the data infrastructure for all of Mozilla. You will work closely with Mozilla's Data Organization, which includes Data Engineering and Data Science teams, to develop and deliver a data platform to support Mozilla's data driven culture. Creative problem-solving skills and an innovative demeanor will be key to success.
As the Data SRE Manager you will...
- Empower a hardworking team of SREs to grow in their roles and align the team’s work with Mozilla’s needs.
- Own the infrastructure for the data platform, data tools, and integrations.
- Manage and prioritize new projects, ongoing support, and technical debt.
- Drive improvements from operations back into development and vice versa.
- Measure, maintain, and report appropriate metrics across products to internal and external partners.
Your Professional Profile
- Bachelor’s degree (or higher) in Computer Science and/or equivalent experience.
- 4+ years SRE experience, 2+ years SRE leadership experience.
- Ability to accept direction and communicate effectively with multiple levels of management and technical expertise.
- Solid experience in cloud environments, specifically AWS or GCP.
- Working knowledge of database systems (SQL and/or non-relational).
- Experience with ETL, data modeling, and cloud-based data storage and processing, including GCP Data Services (Dataflow, BigQuery, Dataproc); see the sketch after this list.
- Experience in workflow and data pipeline orchestration (Airflow, Jenkins etc.)
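As a rough illustration of the BigQuery item referenced above, here is a minimal sketch of the kind of automated data-freshness check an SRE team might run, using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical placeholders, not Mozilla's actual schema.

```python
from google.cloud import bigquery

# Placeholder project; credentials are assumed to come from the environment.
client = bigquery.Client(project="example-project")

# Count yesterday's rows to verify that the pipeline landed data.
query = """
    SELECT COUNT(*) AS row_count
    FROM `example-project.telemetry.daily_events`
    WHERE DATE(submission_timestamp) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""
rows = list(client.query(query).result())
print(f"Rows landed yesterday: {rows[0].row_count}")
```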
About Mozilla
Mozilla exists to build the Internet as a public resource accessible to all because we believe that open and free is better than closed and controlled. When you work at Mozilla, you give yourself a chance to make a difference in the lives of Web users everywhere. And you give us a chance to make a difference in your life every single day. Join us to work on the Web as the platform and help create more opportunity and innovation for everyone online.
Commitment to ersity, equity, inclusion, and belonging
Mozilla understands that valuing diverse creative practices and forms of knowledge is crucial to and enriches the company’s core mission. We encourage applications from everyone, including members of all equity-seeking communities, such as (but certainly not limited to) women, racialized and Indigenous persons, persons with disabilities, and persons of all sexual orientations, gender identities, and expressions.
We will ensure that qualified iniduals with disabilities are provided reasonable accommodations to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment, as appropriate. Please contact us at [email protected] to request accommodation.
We are an equal opportunity employer. We do not discriminate on the basis of race (including hairstyle and texture), religion (including religious grooming and dress practices), gender, gender identity, gender expression, color, national origin, pregnancy, ancestry, domestic partner status, disability, sexual orientation, age, genetic predisposition, medical condition, marital status, citizenship status, military or veteran status, or any other basis covered by applicable laws. Mozilla will not tolerate discrimination or harassment based on any of these characteristics or any other unlawful behavior, conduct, or purpose.
Group: D
#LI-REMOTE
Req ID: R2002
