< class="h3">Data Engineer, Remote
< class="h3">Organizational Overview

Do you want to ensure that progressive organizations have access to the best possible data, technology, and technical expertise? We know movement organizing and advocacy are more urgent than ever, and many organizations and activists cannot easily access the tools to scale their work.
Launched in 2018, The Movement Cooperative (TMC) is a nonprofit founded with the aim of creating a cooperative community addressing these structural challenges. Currently, our membership stands at nearly 75 national member organizations and 500 state-based organizations covered under our licenses. We leverage the collective purchasing power of our membership to secure cooperatively held data and technology licenses shared across membership and invest in common infrastructure like technical staff to support our members in their work. TMC is also a member cooperative -- our board is elected by and made up of our member organizations.
< class="h3">Position Overview:For this role we are seeking someone further along in their career who will help define best practices, and delve into larger, more complex, and/or more open-ended data engineering projects.
< class="h3">Key Responsibilities:- Help to define best practices, and delve into larger, more complex, and/or more open-ended data engineering projects.
- Daily monitoring of data infrastructure, including TMC-built ETL workflows that move data from vendor tools to TMC’s data warehouse (e.g., job re-runs, column re-sizes)
- Communicating in a timely manner with TMC members about bugs, outages, and updates
- Investigating and resolving breakages of TMC-built ETL workflows
- Transforming datasets to improve usability for member data analysts
- Building and maintaining data products that combine data held by different TMC member organizations for shared, communally beneficial purposes
- Contributing to and designing team documentation
- Engaging the open-source Parsons community
- Building new ETL workflows leveraging Python as requested by TMC members
- Acting as first user tester for and providing input on new TMC software products
- Working with external software vendors when building and troubleshooting TMC data infrastructure
Qualifications:
- Robust Python knowledge and abilities
- Proficiency in SQL
- Proficiency with git and GitHub
- Ability to communicate clearly and succinctly
- Ability to explain technical concepts to non-technical audiences
- Inclination toward member-focused customer service (i.e., technical support tasks)
- Ability to work independently and as a team player, to take initiative, and to manage multiple tasks and projects at a time
- Exceptional attention to detail, accuracy and consistency
- Familiarity with or participation in one or more open source communities
- Familiarity with Civis Platform, or other orchestration tools
- Experience in progressive political organizing, especially with political data
- Familiarity setting up and configuring services on AWS and/or GCS
- Familiarity with dbt, Prefect, or other commonly used data engineering/analytics engineering/data flow tools
- Experience working within a remote team or organization, or using remote communication software like Slack
- Prolonged periods of working on a computer
The Movement Cooperative is dedicated to providing all employees a competitive salary, world-class healthcare, dental, and vision benefits, retirement, unlimited paid vacation and sick days, flexible and all-staff holidays, a remote virtual office, paid parental and adoption family leave, and health and wellness benefits. The salary for this position is $81,200 - $96,425. This is a full-time exempt position.
This position is within a union-represented collective bargaining unit, and specific terms and conditions of employment may be subject to a future collective bargaining agreement between TMC and the Union.
< class="h2">Next Steps:Please submit your resume and a cover letter detailing why you are interested in this role and your qualifications, in PDF format.
Here’s what you can expect from our interview process:
- Application Review
- Technical Assessment (~60 minutes)
- Panel interview (~60 minutes)
- Interview with CTO (~30 minutes)
The Movement Cooperative recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, sex, sexual orientation, citizenship status, disability, age, veteran status, or any other status protected by applicable law.

Data Entry Assistant, HSPRS
Remote – US
Requisition ID: 1737
Reports To: Associate Director, Home Study and Post Release Services
Division: US Programs
Department: Children Services, Home Study and Post Release Services
Grade Level: 4
Salary Range: $21.36 – $28.90/hourly
FLSA & Union Status: Non-Exempt, Non-Bargaining
Introduction
Church World Service (CWS) is a not-for-profit organization working to eradicate hunger and poverty and to promote peace and justice around the world. CWS does not discriminate on the basis of race, color, national origin, sex, sexual orientation, gender identity, religion, age, disability or veteran status in employment or in the provision of services.
Primary Purpose: The Data Entry Assistant for Home Study and Post Release Services will support the work of CWS’ national program with regard to data entry and administrative support. The Assistant will serve in a key support role and work in close coordination with the Case Processing team and Provider sites to ensure quality, timeliness, and accuracy in alignment with donor and CWS expectations. The Assistant is responsible for managing a high volume of data and time-sensitive correspondence with a large network.
Essential Duties:
- Organize and upload resources to shared drive for access by national casework team
- Initiate new case creation in CWS database, ensuring accuracy and timeliness of client data
- Pull supporting documents from government database and upload to CWS database
- Organize and track new referrals in the shared HSPRS email inbox and prioritize time sensitive cases for entry into the database
- Communicate with provider sites, including case workers and case processing assistants about the status of data entry for new referrals
- Track case closures and ensure submission of final reports by the due date
- Upload final reports to government database
- Develop strong knowledge of and comfort with various data tracking systems/databases and spreadsheets
- Assist with required donor reports by extracting and compiling data from CWS and donor database
- Other duties as assigned
Qualifications:
- Type 45 words per minute with 95% accuracy
Experience:
Minimum of 2-3 years of equivalent experience.
Skills:
- Spanish preferred but not required.
- Advanced Skills in Microsoft Office Products, including Excel, PowerPoint, and Word.
- Experience tracking large quantities of data.
- Experience in database and spreadsheet management.
- Excellent data entry and data management skills.
- Excellent organizational skills.
- Strong analytical skills.
- Strong written and verbal interpersonal communication skills
- Comfortable working in a fast-paced environment while managing competing priorities.
- Ability to work in a multi-cultural environment.
- Commitment to diversity, equity, inclusion, and willingness to support CWS’ Platform on Racial Justice as a CWS employee.
Education & Certifications:
Associates degree required.
Special Requirements:
- CWS’ U.S. COVID-19 Vaccination Policy requires all new staff to provide proof of COVID-19 vaccination as soon as reasonable, and no later than 60 days after the start of employment. Any employee who is pregnant, who is nursing, who has a disability, or who has a medical condition that prevents them from safe vaccination, or who rejects vaccination because of sincerely held religious beliefs, may contact the CWS Benefits Administrator to request reasonable accommodation as defined by the Equal Employment Opportunity Commission (EEOC).
- Able to sit for an extended period in an office setting. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential job functions.
- English proficiency required; Spanish proficiency strongly preferred.
- Remote position must have access to reliable internet.
- Must be in close proximity to an airport
- Standard work hours with occasional evening or weekend assignments
- Occasional travel may be required to attend in-person meetings, conferences, and events
- Successfully pass PREA (Prison Rape Elimination Act), FBI and Child Abuse and Neglect Checks.
CWS offers a competitive benefits package that includes:
- 403 (b) Retirement Plan
- Medical, Dental and Vision Insurance
- Paid Time Off
- Life Insurance and AD&D
- Long Term and Short-Term Disability
- Employee Assistance Program (EAP)
- Health Savings Account
- Flexible Spending Accounts
- $250 professional development stipend (also can be applied to relevant professional licensure)
Who are we?
Meet Sleek.
We are an all-in-one digital platform helping entrepreneurs and investors to register their companies in Singapore, Hong Kong & Australia, and assist them with the management of their accounting, audit and taxes, in a 100% digital manner. We build cutting edge technology to automate manual work, and provide a delightful experience to our 5,000+ clients. Naturally, we are continuously looking for great people to join our dynamic team!
We are a team of 200 builders from 17 countries, with offices in Singapore, Hong Kong, Sydney & the Philippines, committed to delivering a great experience to our clients.
At Sleek, you will find a fast-paced environment focused on building great solutions, and endless opportunities for learning and knowledge sharing, whether it’s around a coffee with your colleagues at the office, or via video calls with your peers based overseas! If you believe in tech for good, and want to make a change to the way things are done traditionally, this is the place for you.
We are looking for our next Lead Data Analyst, to help us grow!
As a Lead Data Analyst for Sleek, you will be leading all analytics work related to marketing and sales, working with a data analyst and engineer.
Analyzing and working with both descriptive and predictive analytics, you will translate data into valuable insight, in turn aiding the business with your ability to directly influence change.
Your primary responsibilities will be to make data accessible to the relevant stakeholders, in a format they can activate. You will deploy a flexible analytical skill set to prepare, organize, and deliver insightful data and analysis. You will support the marketing and sales departments in their performance analysis and their client acquisition strategies.
You will ensure data accuracy from collection - through web analytics tools - to visualization and sharing.
You will help build data pipelines when needed - from inception and data collection, to analysis and presentation to the stakeholders, including reviewing tags and events collections.
You won't be afraid to approach new, unique problems, and you'll be capable of working with large amounts of data with high attention to detail.
Duties and Responsibilities:
- Improving the data collection setup of Sleek, including reviewing Google Tag Manager tags and CRM automation and tagging
- Aligning with involved stakeholders to set up dashboards and reports that drive data-driven decisions across all departments
- Analyzing data with the appropriate means - Python, SQL, Excel, or other tools
- Sharing data through relevant channels and dashboards
- Proposing and creating innovative uses of data to improve clients’ experience, Sleek’s operations, marketing and sales efforts, and Sleek’s pricing structure
Requirements
- At least 5 years of experience in a data position, with experience in SQL and data visualization
- Managerial experience and excellent people skills
- Self-starter, independent, motivated and result-oriented team player
- Possess the drive to overcome challenges in a start-up environment
- Fluency in English is mandatory
- Familiar with SQL and Python or R
- Familiar with data visualization tools - Tableau or equivalent
- Familiar with Google Analytics, Google Tag Manager, or other web analytics tools
- Familiar with Excel and/or Google Sheets
- Detail-oriented and data-driven
Benefits
How about the Company & work environment?
● Our values are: Simplicity, Loyalty, Excellence, Entrepreneurship, Kindness
● We take corporate social responsibility very seriously at company level and absorb the yearly carbon footprint of all of our employees
● Multicultural team with 15+ nationalities and as many languages spoken internally!
● Work-from-home arrangement with friendly and collaborative teams
● Health benefits (HMO) and paid holidays
● We celebrate you: Sleek is an equal opportunity employer and welcomes applications regardless of race, ethnicity, religion, gender, age or identity.
Join the Sleek family today!


USA only
about the senior data engineer role:
soona is looking for our first data engineer who will be part of the foundational team helping us build a world-class data organization. you will work closely with the data science, analytics, and engineering teams to create and maintain the data architecture we will use to advance our data science and analytics capabilities. to be successful in this role, we want you to have done this before and have the ability to take ownership of building a cutting edge data architecture.
this is a full time position that will report directly to the senior director of data science.
about tech at soona:
at soona, we’re focused on building a world-class engineering and data organization. we’re developing a highly-scalable platform for real-time customer engagement with our studio creatives and technology that optimizes the content they create. our typical engineering and data projects blend SaaS with e-commerce, providing opportunities to work on everything from app engineering and cloud/server architecture to computer vision and logistics/routing optimization. our tech stack consists primarily of ruby on rails, javascript vue, and python. we pride ourselves on our culture of innovation, community engagement, technical mentorship, and caring for the individual.
about soona:
soona makes it possible for brands to create professional photo and video starting at $39. our studios give customers a playground for creating their content and our online platform makes it possible for any product company in the world to experience a remote shoot. we are creating a fast casual content revolution!
soona is currently supporting a remote work environment for this role; however, we are currently looking to hire in these states only: California, Colorado, Illinois, Massachusetts, Minnesota, New Jersey, New York, Texas and Washington with requirements to be able to work in mst or cst time zones. for flex hybrid work environments, we also have office spaces in Denver, Colorado + Minneapolis, Minnesota, if that’s your thing.
Requirements
an ideal candidate can:
- build data architecture and systems to populate an effective data lake/warehouse
- build ELT pipelines to ingest source data and transform it in a data layer for analytics
- develop data integrity tests to automate quality checks of data inserts
- create and maintain telemetry/observability systems to monitor data app performance
- think for themselves and discover new and insightful ways to solve difficult problems
- deliver quality code in an Agile framework that ships to a production environment
- communicate with data and engineering teams as well as business stakeholders
has experience in:
- effectively communicating and coding in a remote work environment
- python – the core language of the data organization
- building, maintaining, and owning ETL/ELT pipelines for data lakes and warehouses
- sql – consuming from OLTP, and optimizing OLAP architecture for analytics performance
- dbt – our primary data transformation layer
- aws or equivalent cloud environment
- looker, segment, airbyte, docker, airflow, kubernetes (preferred)
- working in a startup environment (preferred)
Benefits
we can offer:
- strong starting salary: $140,000 - $160,000
- benefits & perks + unlimited pto + intentional culture
- really badass headshots

about the senior data analyst role:
soona is looking for a senior data analyst who will work closely with our business leadership team in using data to inform our decision-making in marketing, finance, and related b2b processes. to be successful in this role, you should have experience in working with common business concepts like CAC, retention, attribution, LTV, and the like. you should also excel at writing efficient sql queries to transform raw data and extract the resulting metrics, performing statistically-sound analysis to yield actionable insights, and visualizing those insights in a way consumable to stakeholders with a diverse set of backgrounds/expertise. you will be part of our centralized data organization, where you will collaborate with data scientists and data engineers to create and maintain data transformations within our data pipeline and generate reports featuring your analytical results.
this is a full time position that will report directly to the senior director of data science.
about soona:
soona makes it possible for brands to create professional photo and video starting at $39. our studios give customers a playground for creating their content and our online platform makes it possible for any product company in the world to experience a remote shoot. we are creating a fast casual content revolution!
soona is currently supporting a remote work environment for this role; however, we are currently looking to hire in these states only: California, Colorado, Illinois, Massachusetts, Minnesota, New Jersey, New York, Texas, and Washington with requirements to be able to work in mst or cst time zones. for flex hybrid work environments, we also have office spaces in Denver, Colorado + Minneapolis, Minnesota, if that’s your thing.
about tech at soona:
at soona, we’re focused on building a world-class engineering and data organization. we’re developing a highly-scalable platform for real-time customer engagement with our studio creatives and technology that optimizes the content they create. our typical engineering and data projects blend SaaS with e-commerce, providing opportunities to work on everything from app engineering and cloud/server architecture to computer vision and logistics/routing optimization. our tech stack consists primarily of ruby on rails, javascript vue, and python. we pride ourselves on our culture of innovation, community engagement, technical mentorship, and caring for the individual.
Requirements
an ideal candidate can:
- hit the ground running with marketing analytics, conversion measurement, customer engagement modeling, forecasting, and other analytics common to a b2b startup
- help create and maintain our multi-source data ingestion pipeline
- access data from our data warehouses to create datasets for analytics
- lean on a broad tool set of statistical techniques to analyze our data
- create clear and concise data visualizations to communicate analytics results
- deliver quality code in an agile framework that ships to a production environment
- correlate work to business KPIs and be accountable for improvement targets
- communicate with data and engineering teams as well as business stakeholders
has experience in:
- effectively communicating and coding in a remote work environment
- working with marketing, sales, and finance teams and understanding their target KPIs
- looker or equivalent data reporting platform
- sql – optimizing performance/transformation efficiencies on OLAP data warehouses
- dbt – our primary data transformation layer
- customer analytics tools like google analytics, amplitude
- marketing platforms like google ads, facebook business manager, etc (preferred)
- segment, airbyte or equivalent data-ingestion platforms (preferred)
- python – the core language of the data organization (preferred)
- aws or equivalent cloud environment (preferred)
- working in a startup environment (preferred)
Benefits
we can offer:
- starting salary: $100,000 - $125,000
- benefits & perks + unlimited pto + intentional culture
- really badass headshots


Europe-based applicants only
About Reedsy
We’re here to give authors the tools and resources they need to create beautiful books. Our marketplace gathers the industry's best publishing professionals — the likes of Neil Gaiman's editor, Nora Roberts' book marketer, and GRRM’s cover designer.
We’ve grown to a community of 1,000,000 authors and over 3,000 freelancers in just a few short years, while picking up a number of industry awards along the way 🏆
Want to hop on board and help us get to our next destination? Cool, keep reading!
Responsibilities
We have built a number of products at Reedsy, including a marketplace (https://reedsy.com), the world's largest short story contest, and a book discovery platform (https://reedsy.com/discovery). We have millions of users and are always looking to reach more.
You will:
- Design our end-to-end data infrastructure and implement tools for data visualization;
- Use Machine Learning and NLP to automatically analyze and categorize projects submitted on the marketplace;
- Improve funnel conversion tracking through our entire ecosystem;
- Conduct data analysis to optimize liquidity (offer/demand) on our marketplace (think "Reedsy search engine optimization") and Discovery (book submissions/reviewers);
- Work with different functions in our organisation that have critical data insight needs (eg. our marketing team for lead acquisition and growth); and
- Pro-actively find insights that help us make better products.
Requirements
You have strong enough development skills that you are not reliant on our developers to set things up for you. You have experience with:
- Elasticsearch
- Google BigQuery
- Using query languages (SQL and variants)
- Data Mining (web crawling, open data, parsing)
- Scripting and programming (Python, R, Ruby)
- Machine Learning, ideally with some knowledge about Deep Learning
- Strong writing skills and native-level English
- Experience working remotely
Bonus points
- Experience creating robust and elegant dashboards to present data;
- A good dose of ambition, curiosity, as well as a great sense of humour.
Benefits
- Work From Home;
- Paid paternal and maternal leave;
- Reedski, our annual ski trip
Salary range:
€70k – €100k

Who We Are
TetraScience is the Scientific Data Cloud company with a mission to accelerate scientific discovery and improve and extend human life. The Scientific Data Cloud is the only open, cloud-native platform purpose-built for science that connects lab instruments, informatics software, and data apps across the biopharma value chain and delivers the foundation of harmonized, actionable scientific data necessary to transform raw data into accelerated and improved scientific outcomes. Through the Tetra Partner Network, market-leading vendors access the power of our cloud to help customers maximize the value of their data.
Our core values are designed to guide our behaviors, actions, and decisions such that we operate as one. We are looking to add individuals to our team that demonstrate the following values:
- Transparency and Context - We trust our people will make the right decisions and overcome any challenges when given data and context.
- Trust and Collaboration - We believe there can only be trust when there is transparency. We are committed to always communicating openly and honestly.
- Fearlessness and Resilience - We proactively run toward challenges of all types. We embrace uncertainty and we take calculated risks.
- Alignment with Customers - We are completely committed to ensuring our customers and partners achieve their missions and treat them with respect and humility.
- Commitment to Craft - We are passionate missionaries. We sweat the details, as the small things enable the big things.
- Equality of Opportunity - We seek out the best of the best regardless of gender, ethnicity, race, or age. We seek out those who embody our common values but bring unique and invaluable perspectives, talents and advantages.
What You Will Do
The TetraScience Professional Services Data Science Engineer will report into the Professional Services organization and will be responsible for the implementation of the TetraScience Data Platform for new clients. This entails execution on the project scope to ensure TetraScience meets or exceeds client expectations and fulfills their requirements.
Additional duties include research and prototyping of data acquisition strategies for scientific lab instrumentation, along with file parsers for instrument output files (.xlsx, .pdf, .txt, .raw, .fid, and many other vendor binaries). Based on project scope, design and build data models, Python data pipelines, unit tests, integration tests, and utility functions. As needed, build visualizations, reports, and dashboards using Spotfire, Tableau, and Jupyter notebooks for new clients.
Requirements
What You Have Done
- 2+ years’ experience working in Python and SQL.
- AWS/GCP/Azure certification desired.
- Elasticsearch, science background, or experience with scientific instruments desired
- Experience with tools like Spotfire, Tableau, Jupyter notebook (any of them) desired
- Excellent verbal and written communications skills; ability to explain technical information in non-technical language to drive progress.
- Ability to manage multiple simultaneous projects, proactive troubleshooting skills and attention to detail.
- Bachelor’s degree in Computer Science, Statistics, Chemistry, Business, or a related field, or equivalent work experience
Ideal if you have
- Passion for science and building solutions that make data more accessible to end-users.
- Intellectually curious: Unwavering drive to learn and know more every day.
- Ability to think creatively on how to solve project risks without reducing quality.
- Ability to distill and present complex information to a wide range of stakeholders.
- Team player and ability to "roll up your sleeves" and do what it takes to make the team successful.
Benefits
- 100% employer-paid benefits for all eligible employees and immediate family members.
- Unlimited paid time off (PTO).
- 401K.
- Flexible working arrangements - Remote work + office as needed.
- Company paid Life Insurance, LTD/STD.


South Africa
Main Purpose of Job
We are looking for a Data Engineer focusing on our Information Assurance offering. The Data Engineer will be responsible for project implementations based on solution design specifications and in accordance with our blueprint, standards, and procedures. You will support our clients in their cloud migration journey, as well as work toward identifying opportunities to solve complex data problems within the African markets and potentially beyond. The ideal candidate will have a keen interest in Big Data, Data Warehousing projects, and Information Management, as well as a good understanding of IM-related configuration (SQL) and communication protocols.
You will be part of a passionate team as we undergo our agile transformation, focused on building a footprint as we expand our Big Data offering to new and prospective clients. You should be well versed in current and upcoming data tools and technologies. You should have a go-getter attitude and be willing to learn at all times. You should have the ability to effectively translate system requirements into technical requirements for the team. As the Data Engineer, you will be required to interpret and analyze data to perform solutions design. You will also have a large say in the roadmap going forward and how we approach it as we design our roadmaps from within the team.
Main Responsibilities
- Develop, construct, test and maintain data pipelines, Big Data platforms, Data Warehouses, and architectures
- Align architecture with business requirements
- Develop data set processes
- Identify ways to improve data reliability, efficiency and quality
- Conduct research for industry and business requirements
- Use Big Data sets to address product and business issues
- Prepare data for predictive and prescriptive modelling
- Identify hidden patterns using data
- Use data to determine tasks that can be automated
- In-depth Investigation of complex discrepancies and irregularities that arise from SLA reports and client escalations
- Report on all issues investigated to both client and internally
- Implement projects and assigned tasks within agreed timelines
- Conduct Interconnect Billing activities including loading of rate sheets
- Set-up and design of new reports and ad-hoc reports
- Assisting clients with ad hoc queries and requests in a timely manner
- Assist with daily reviews for clients, adhering to SLA timelines and escalating issues that may arise
- Mentor junior and other team members. Support efforts to cross-train staff to ensure proper redundancy and growth opportunities
- Tool maintenance (Data classification maintenance; maintaining existing reports; reference data maintenance, configurations, Record level matches)
- Review quality of monthly value-add reporting and ensure timely submission thereof
- Aggressively grow yourself within the role
KPIs
- Project Configuration, Development & Deployment
- Change Management
- Internal & External Communication
- Career Development
- Operational Support
Qualifications and Experience
- 5 Years SQL and ETL experience
- 2 – 4 Years Big Data and AWS experience
- 2 – 4 Years Data Engineering and Information Systems experience
- Machine Learning exposure advantageous
- Tertiary qualification
- Experience in the following would be an advantage:
  - Experience with BI and/or Data Engineering tools
  - Knowledge of relational databases and stored procedures
- Academic background in foundations of Data Engineering: Statistics, Mathematical modelling
Key competencies and skills
- You have strong analytical skills
- You are a Problem Solver
- Excellent written and verbal communicator
- Willingness to constantly update knowledge
- Ability to context switch, working on more than one issue and project simultaneously as we often work with many countries and cultures at once
- Able to carry out tasks with little or no supervision
Personal attributes
- You're able to work both in a team and independently
- You are a collaborator
- Excellent communication skills and able to give valuable feedback to your team or the client
- Deep Analytical Thinking is your play-space
- You are able to Influence and Negotiate
- You have the ability to work well under pressure
- Maintaining logical and balanced approach
- Self-motivated and self-managed
- Able to use initiative and take ownership
- Attention to detail

DEVELOPMENT COORDINATOR
HOW YOU WILL MAKE A DIFFERENCE
MII’s Development Coordinator will be responsible for assisting in securing the essential funds that power our work to transform the materials industry by developing and supporting the ecosystem for sustainable and animal-free replacements for leather, wool, silk, down, fur, and exotic skins.
As MII’s Development Coordinator, you will support our development team and own the administrative processes of our three primary fundraising programs: leadership giving (major gifts); annual campaign; and corporate and foundation grant seeking. The Development Coordinator will work closely with the Chief Development Officer (CDO), other members of the Development Department, and the Finance Team in all development and fundraising endeavors to ensure the department runs smoothly and donors are stewarded properly.
While the Development Coordinator will eventually report to the Chief Development Officer and look at the team’s functions to create a well-oiled operation, they will be in service to each development team member, prioritizing tasks based on hours with the CDO as needed. The Coordinator will own the organization’s Development team files (e.g., Google Drive), technology (primarily the donor CRM), and processes and procedures for the team.
RESPONSIBILITIES
MAINTAIN ALL DONOR RECORDS
- Ensure donor management database is up to date and constituent records are comprehensive so as to provide accurate report generation and help ensure the organization’s fundraising success.
- Oversee and perform regular data entry into the organization’s donor CRM system.
- Create and maintain donor profiles, custom reports, and mailing lists to support the work of development staff and to help the Development Department achieve its goals for all campaigns and appeals.
- Conduct research on an as-needed basis for Development Team members to best implement the department’s priorities and to prepare for annual appeals and future campaigns.
- Ensure the donor newsletter (i.e., the Insider) subscriber list is accurate and up to date.
- Responsible for running the reports necessary from the database as requested by the team, including, but not limited to, assisting with the department’s metrics tracking.
SUPPORT THE DEVELOPMENT TEAM WITH DONOR PROSPECTING, CULTIVATION, AND STEWARDSHIP
- Regularly screen inbound donor prospects as needed through donor prospecting software and internet research and share findings with the Development Team.
- Support development department with booking travel for in-person cultivation, stewardship, and solicitation events.
- Oversee the printing and mailing (as needed) of development materials including, but not limited to, appeals, the annual report, banners and fliers for fundraising events, and hard copies of our other reports and materials for fundraising purposes.
- Assist with year-end donor report mailing, invitations, appeal mailings, and annual tax letters.
- Own the donor acknowledgment process, understand what is needed for each gift, keep the team following through on all gifts that come in, understand and follow best practices, and send hand-written thank you notes when applicable.
- Support the event manager with researching venues and vendors for in-person cultivation, stewardship, and solicitation events.
- Support with any other tasks the Development Team may need assistance with, as agreed upon.
MANAGE THE ADMINISTRATIVE NEEDS OF THE DEVELOPMENT TEAM
- Own the Development Team’s Google Drive, keeping the team organized and presenting to the team if the method of organization changes.
- Create and update as necessary the Development Team’s Standard Operating Procedures (SOPs), documenting the processes we already have in place, updating and communicating with the team when processes change, identifying whitespaces for better processes, and calling interview/brainstorming meetings with relevant team members for new processes we need to implement.
- Responsible for coding, managing, and generating reports for the Development Team as needed.
- Collaborate with teammates to assist them in their respective area’s work as appropriate, including but not limited to liaising about tech integrations between our various platforms (e.g., the website and the database), researching foundation prospects, or doing meeting note entry and tasks for team members and yourself in Asana.
- Assist the CDO in making the best use of Asana for personal and team efficiency.
- Work with the CDO to create synergy between the database and Asana or other tech platforms.
- Work with the CDO and COO to create systems for keeping track of the fundraising numbers on a regular basis; this includes, but is not limited to, pledge installments and fulfillments and YTD fundraising by category.
- Oversee the organization’s state charitable registration requirements via our Harbor Compliance service to ensure we are registered as fundraisers in each state in which we fundraise.
- Participate in all platforms of communication, most specifically email, Slack, Google Drive, and Asana.
- Assist in overseeing and working with Development volunteers.
- Participate in at least one 1:1 meeting per week with the CDO to prioritize based on a full or part-time schedule, align, and execute on tasks. (Please note these meetings will be with the COO until the CDO is onboarded.)
- Participate in Development and all team meetings.
WHO WE’RE LOOKING FOR
The Development Coordinator is the foundation by which we are able to fundraise and ultimately grow as an organization. Through process, organization, and creativity, the Development Coordinator helps us plan better for the future financially, have more time for field fundraisers to cultivate new donors, and ensures that all of our donors have a wonderful interaction with us at any point in their philanthropic process. It is an essential role in the organization, and the ideal candidate can identify with the following traits:
- A proven track record of managing a multifaceted development program.
- Experience working within a team and working independently.
- Is a problem solver, proactive, and is detail-oriented.
- Previous experience organizing and coordinating teams is a plus.
- Excellent written, interpersonal, and verbal communication skills.
- A capacity to work independently and effectively under the pressure of deadlines.
- Excellent research skills and an understanding of how all the development pieces fit together into the larger puzzle.
- Comfortable working remotely and in a growing and evolving organization.
- Strong support for MII’s philosophy and mission.
We want the best people and we don’t want biases holding us back. We strongly encourage people of every color, orientation, age, gender, origin, and ability to apply. Because we value a diverse workplace, we prioritize an inclusive climate absent of discrimination and harassment during the application process and after you join the team. We encourage candidates with most of the above to apply; we do not expect all candidates to fit this job description 100%.
THE FINE PRINT
- Terms of employment: Full-time, exempt
- Reports to: Chief Development Officer (temporarily Chief Operating Officer)
- Location: Remote; applicants from around the world are encouraged to apply.
- Benefits: Working from home, flexible holiday and vacation plans, room for advancement, respectful managers, and supportive colleagues. HRA plan for US-based employees
- Salary: $50,000 – $60,000 annually in the U.S. for full-time employment, dependent on experience. Salary will be adjusted for international locations. A premium may be applied for higher cost of living areas.

Puerto Rico
Hello from tea!
The tools that build the internet have steeped too long. For the past two decades, big tech has made trillions off the generosity of visionary developers and web pioneers… never thanking, never mentioning, and certainly never paying. At tea, we're brewing something to change that by enabling developers (you) to continue doing what you love, while earning what you deserve.
Role
The Data Engineer role is focused on data management, augmentation, and visualization. It's a full-time remote contractor position.
Responsibilities
- Building an efficient data ingestion system that takes in data from a variety of sources, primarily APIs
- Work with teammates to define data requirements for their projects
- Manage data schema and pipeline to provide the data format and availability as needed
- Develop data visualization tools for simulation modeling, with tweakable parameters for fast iterations
- Assist with developing model parameters for finding optimal simulation results
Qualifications
- Minimum 2 years of data engineering or similar role
- Minimum 2 years of TypeScript experience
- Minimum 2 years of experience with database schema design, preferably Postgres
- Professional experience with data visualizations using D3.js or similar
- Great communication skills, with the ability to clearly explain technical issues
- Strong attention to detail while maintaining the ability to manage and handle multiple tasks concurrently
Why work for tea?
Join a team of motivated and passionate individuals and be a part of a rapidly growing and exciting project in open source and web3 technologies. We offer an attractive salary, relocation packages (if applicable), token/equity options, and performance-based bonuses.

< class="h3">Company Description

Yuxi Global is an American company with highly functional teams across Latin America.
We stay up to date with the most modern, cutting-edge practices and technologies. Our teams are versatile, adaptable, and have expertise in a wide range of programming languages, databases, and frameworks.
This is an invitation to anyone who loves working with the newest technologies and is on top of what’s next. Do you have a keen eye for performance, and are you always seeking the highest quality in code? We are looking for someone like you!
< class="h3">Job DescriptionCURRENT STACK & TOOLS
Cloud services: Azure, DataFactory, Databricks, and Messaging Queues.Languages: Python, PySpark, and SQL.Data Storage: Azure Data Lake Storage, BigQuery.Tools: GIT, Jira.Communication: Confluence, Teams. < class="h3">QualificationsWHAT YOU BRING TO CLICKDIMENSIONS
4+ years working with Power BI
5+ years demonstrated experience in a Data Engineer/Scientist role.2+ years of experience with Python, PySpark, Databricks, SQL is needed.2+ ML & MLOps experience is not a must but nice to have.Experience working with remote teams is a plus. < class="h3">Additional Information
< class="h1">Description

This is a remote position.
Position: Data Analyst
Department: Solutions – Operations
Reports To: VP, Operations
Date: November 2022
The Data Analyst is responsible for leveraging data and other resources to identify actionable trends and patterns that support business, technology and operations decision-making. Through a combination of deep domain expertise, expert product knowledge and strong data skills, the Data Analyst will turn data into information, information into insight, and insight into business decisions. The Data Analyst reports to VP, Operations. This is a full-time, exempt, remote position.
Specific Responsibilities:
- Import, clean, transform, validate or model complex data sets from multiple sources in order to understand and answer business questions surrounding:
  - Data quality
  - Operational improvement opportunities
  - Strategic planning
  - Market insights
  - Customer-facing business queries
  - Usage metrics for organizational utilities
- Identify root causes for problems related to adoption, usage, operations and quality, and identify improvements by pinpointing trends, correlations and patterns in complicated data sets.
- Design and present data and conclusions in charts, graphs and tables. Maintain ongoing reports.
- Develop and maintain data quality assurance measures and processes.
- Build out and automate the organization's internal and external reporting and analytics capabilities.
- Recommend data migration approaches for product upgrades.
- Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality.
- Design and develop relational databases for collecting data; write Data Definition Language or Data Manipulation Language SQL commands.
< class="h3">Requirements
Skills:
- Demonstrated self-starter and creative thinker who can handle a considerable amount of unstructured, real-world data and be comfortable thinking about solving problems using new and novel approaches.
- Experience using one or more business intelligence software tools (Tableau, Cognos, Business Objects, etc.).
- Knowledge of statistics and experience using statistical packages for analyzing large datasets (SPSS, SAS, Excel, etc.).
- Knowledge of a querying language such as SQL and the ability to write code without depending on an intermediary, guided interface (drag and drop tool).
- Ability to identify, analyze and interpret trends or patterns in complex data sets and present data in a way that is intuitive to technical as well as nontechnical users.
- Excellent time management skills and the ability to prioritize a full plate.
- Ability to work collaboratively and effectively across matrix organizations in a fast-paced, schedule-driven and entrepreneurial environment.
Experience:
- Minimum of 4 years of hands-on experience in a Data Analyst role.
- 2+ years of performing data analysis in support of a product organization.
- Healthcare experience preferred (payer or revenue cycle management).
Education:
- Bachelor’s degree required; Mathematics, Economics, Computer Science, Information Management or Statistics disciplines preferred.
Certification Requirements:
- None.
< class="h3">Benefits
This organization recognizes that its most important asset is its growing team of smart, creative, collaborative, forward-thinking and passionate professionals – and that a comprehensive employee benefits package is an important factor for them in choosing where to work. This organization offers competitive compensation along with an extensive benefits package for all full-time employees, including medical, dental and vision coverage, tuition assistance and a 401k. We offer full-time remote work to all staff from any location and maintain a physical office (with many amenities) in downtown Washington, DC.
< class="h1">Skills
Skills: Demonstrated self-starter and creative thinker who can handle a considerable amount of unstructured, real-world data and be comfortable thinking about solving problems using new and novel approaches. Ability to identify, analyze and interpret trends or patterns in complex data sets and present data in a way that is intuitive to technical as well as nontechnical users. Ability to use critical thinking skills to propose solutions and think strategically. Experience using one or more business intelligence software tools (Tableau, Cognos, Business Objects, etc.) Knowledge of statistics and experience using statistical packages for analyzing large datasets (SPSS, SAS, Excel, etc.) Knowledge of a querying language such as SQL and the ability to write code without depending on an intermediary, guided interface (drag and drop tool). Excellent written and verbal communication skills. Must be comfortable with speaking in front of a wide variety of technical and business stakeholder audiences. Basic understanding of software development and relational data models. Ability to work collaboratively and effectively across matrix organizations in a fast-paced, schedule-driven and entrepreneurial environment. Excellent time management skills and the ability to prioritize a full plate. Intermediate to advanced skills with Microsoft Office, Word, Excel, PowerPoint. Experience with Customer Relationship Management platforms such as Salesforce is preferred. Percent of Travel: None. Experience: · Minimum of four years of hands-on experience in a data analyst role. · 2+ years of performing data analysis in support of a product organization. · Healthcare experience preferred (provider data or network management). Education: Bachelor’s degree required; Mathematics, Economics, Computer Science, Information Management or Statistics disciplines preferred.

The Data & Analytics Engineer will work directly with the Head of Engineering to improve and scale data infrastructure at Skiptown, the all-in-one ecosystem for pets and their people, assisting with Skiptown's growth as we build 50+ locations in the next 5 years.
As our Data & Analytics Engineer you will…
- Define the data pipeline architecture and build/maintain the data analytics infrastructure for the organization across our 5 enterprise and client-facing applications
- Collaborate with business leaders to build the roadmap for internal data products that support all departments (with a focus on Operations and Marketing)
- Produce data pipelines that allow for high levels of customized communication with our large client base through internal products and external marketing tools
- Create usable data products (dashboards, visualization tools, etc.) to support KPI tracking and team access to data insights
- Provide clean, transformed, and highly accurate data to be piped into relevant systems
- Support the two-way integration of external software and tools with our internal data
- Act as a QA tester for any large product releases
The ideal Data & Analytics Engineer is someone who…
- Lives and breathes data, and gets excited about data warehousing, creating innovative data solutions, and maintaining data quality
- Is eager to architect a data analytics infrastructure from the ground up
- Is proficient in selecting and implementing systems to support data analysis and pipeline development
- Has the skills to effectively communicate with technical and non-technical people
- Has an entrepreneurial spirit - you're comfortable with solving unfamiliar problems
Requirements:
- 3+ years of relevant work experience
- Experience with both SQL and NoSQL databases, warehousing technologies, ETL/ELT, and event tracking tools
- Experience building data pipelines from scratch and selecting the tools to support those pipelines
- Bonus: knowledge of dbt, BigQuery, Firebase, Google Analytics, and other similar platforms
- Startup experience preferred
Structure & Benefits:
- Prefer candidates based in Charlotte, NC or Atlanta, GA. Will consider remote work for the right candidate.
- Medical, Dental and Vision Insurance
- Unlimited PTO
- DoorDash DashPass
About Skiptown:
Skiptown is on a mission to make the lives of pets and their people easier, and even more fun, through a tech-enabled, premium pet services ecosystem, and state-of-the-art facilities. Our 24,000 sq ft flagship location in Charlotte offers dog daycare, boarding, grooming and a social bar - and will soon expand to include retail, veterinary, training and transportation services. Skiptown is well-funded and preparing to launch 50+ locations across the country in the next 5 years.

About Us:
Atmosera is a full lifecycle cloud technology transformation firm, offering professional services, management, and training. Our expertise across the Microsoft Azure platform allows us to accelerate innovation speed, operational agility, and customer responsiveness for our clients across the country.
We are a leading Microsoft Azure Gold Partner with solution offerings across Application Innovation, Advanced Data & AI, DevOps and Cloud Infrastructure.
Our Data Practice at Atmosera is integral in helping our customers build their Data Estates on Azure. We specialize in everything from migrating existing data analytics platforms from on premises to Azure to architecting cloud scale analytics platforms from the ground up. We leverage our in-house expertise and best practices from infrastructure as code through individual service configuration. We are one of the few Microsoft Gold Partners to achieve the Advanced Specialization Analytics on Azure.
Job Description:
The Azure Data Architect is a Senior Consultant and will drive high priority customer initiatives, leveraging Azure data services to solve the biggest and most complex data challenges faced by Atmosera's enterprise customers. This is a customer facing role, accountable for the end-to-end customer deployment and usage for solutions focused around Azure data services.
Responsibilities:
- As a Senior Data Consultant, you will be customer facing and provide recommendations and thought leadership in the areas of advanced data techniques, including data modelling, data access, data integration, data visualization, data discovery, and database design and implementation.
- Translate business requirements to technical solution by providing high-level architecture expertise directly with the customer and project teams. Provide documentation of solutions as well as recommended course of action for our clients.
- Design and implement logical and physical data models that are appropriate for the chosen cloud and/or hybrid data warehouse environment.
- Implement data architectures to support a variety of data formats and structures including structured, semi-structured and unstructured data.
Required Technical Skills:
- Deep technical knowledge of Azure services
- Databricks/Spark development experience (Python and/or Scala)
- Experience with multiple full life-cycle data warehouse implementations
- Understanding of data architectures that utilize Azure Data Factory and/or Synapse Pipelines
- Experience with the following Data Services on Azure: Azure Data Lake, Azure Storage Accounts, Azure Synapse Analytics, Azure Data Factory, Azure SQL database
- Demonstrated ability to quickly research, learn, adopt, and apply new technologies
- Data profiling and creation of source to target mappings
- Ability to provision and configure Azure data service resources
- Familiarity with Agile Methodology
- T-SQL & Python proficiency
Preferred Technical Skills:
- Experience with DevOps practices around Azure Analytics Services
- Experience with data modeling using Microsoft Visio or Draw.io
- Microsoft Azure Cosmos DB, Azure Networking, Azure Active Directory
- Experience creating and implementing strategies to migrate customers from on-premises environments to Azure
- Power BI and semantic modeling
- Experience with AI/ML concepts
- DP-203 certification
Perks & Benefits
- We invest in our resources, helping you to take your skills to the next level with yearly continued career development
- Health, vision, dental package
- 401k with employer matching
- Company paid Life, AD&D, Short- and Long-Term Disability
- Performance-based compensation with bonus potential in addition to salary
- Employee Recognition and Reward Programs
- Community Service Leave
This is a full-time position in the United States with the ability to work from home, or from one of our many US offices if local.
Atmosera is an equal opportunity employer. We celebrate ersity and are committed to creating an inclusive environment for all employees. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an inidual with a disability, or other applicable legally protected characteristics. All employment is decided on the basis of qualifications, merit, and business need.

The Company
Fuzzy is a health and wellness pet company building for the pet parent. We’re changing the way pet parents think about caring for their pets, looking to empower the pet parent by putting caring for our pets on autopilot. At Fuzzy, we work every day to make veterinary care more accessible. Of the 200 million dogs and cats in the US, more than 100 million receive no veterinary care.
Through technology we’re creating a different type of relationship between vets and pet parents - one that is personal, empowering, and for the long haul.
The Role
The Fuzzy Data Science team partners with people across the company to drive the business forward with data. We’re looking for help understanding our data, making predictions to improve our product and customer experience, and optimizing our processes.
We collect data from a wide variety of sources, and our ideal candidate for this role will be someone with experience building successful models while being data-driven, curious, and a clear communicator.
< class="h3">Responsibilities:

- Develop a thorough understanding of our data and interpret them as they relate to pet parent behavior, pet medical records, care delivery outcomes, and product development.
- Build predictive models and optimization routines, with applications ranging from text recognition and classification, care recommendations, churn prediction, and optimizing internal processes.
- Collaborate with other members of the Data Science & Data Engineering team on ways to approach problems, improve code, and share new techniques
- Work with functional team leaders (in Product, Digital Health, Marketing, etc.) to explain your analysis, findings, and recommendations
- Find opportunities for improvement across the business and assist in decision-making across teams
Requirements:
- 4+ years of experience with predictive modeling, machine learning, and data science
- Strong eye for detail
- High degree of proficiency with Python or R
- Proficiency with SQL
- Bachelor’s Degree in a quantitative field
- Comfortable writing and debugging production code
- Experience deploying predictive models in a cloud environment (we use AWS)
- Experience with batch and real-time compute models
- Experience with BI and Analytics tools (we use Tableau and Amplitude)

Job Description:
TableCheck, Japan's leading restaurant reservation management platform, is searching for a skilled Data Operations Engineer for our Data Operations Team. As a member of this team you will own our data-related products and help support our demanding business and developer needs.
We are working on significant improvements to our Data Operations by utilizing new operations frameworks and tooling.
TableCheck has embraced remote work. As such, communication and documentation are in our blood. We emphasize best practices such as agile, scrum, automation, and customer-centric improvements. We look for people who constantly learn from mistakes and adapt, and we expect members of our teams to constantly follow up with questions and updates to keep everyone in the loop.
Our new software stack is being built with the latest technologies including dbt, Airflow, and Broadway. We primarily deploy our services using AWS.
Responsibilities include:
・Implementing source management through data generation and schema management.
・Define data ingestion pipelines that normalize and transform clients' data to fit our standard data schema.
・Ensure data meet quality standards and routinely work with our customers, handling their requests and configuring the platform to their needs.
・The drive to collaborate, gather feedback, solve problems, and tackle challenges through test and learn is highly valuable in this position.
Mandatory Skills
・Experience with database technologies such as PostgreSQL
・Experience with event streaming technologies such as Kafka
・Experience implementing the Data Engineering Lifecycle in an organization (both code and communication)
・2-3 years of experience in a data scientist / data analyst / full-stack programming with Big Data or similar position
・Proficient with SQL, Python scripting and Notebooks
・Familiar with machine learning concepts
・Demonstrated ability to absorb information, manage your own time and commitments, achieve desired results in a highly technical environment
・A fluent level of English
Recommended Skills
・Previous startup experience is highly desired
・Experience with AWS (API-level)
・Experience with MongoDB
・Security related development knowledge, especially related to PII
・Bachelor's degree in Engineering, or a related technical field
Language Skills
・A native level of English is required.
(No Japanese language knowledge is required for this role.)
Evaluation Criteria
We will evaluate candidates based on the following stages:
・Initial interview - a one-on-one 15 minute chat over Google Meet to see if we're the right fit
・Technical interview - (virtually) meet the SRE team at TableCheck to evaluate your skills (no whiteboard or materials required)
・Take-home project - we will provide you with a 30-60 minute project, which will evaluate your dev and ops skills
We're looking for a Performance Analytics Manager to join our team. This team leader position plays a critical role in development and execution of performance analytics strategy, delivery of our performance analytics services, and data storytelling to both internal and client stakeholders. The ideal candidate is equal parts creative data strategist, passionate people leader, and hands-on practitioner and brings 5+ years of experience and technical expertise to the table.
As the Performance Analytics Practice Area Leader:
- Provide leadership and direction to the Performance Analytics team; own service delivery, quality, and team structure, as well as team performance, utilization, and growth.
- Develop departmental and integrated (department's role within the agency) vision and plan(s) to accomplish goals.
- Drive quality and consistency in team operations, processes, and deliverables through development and ownership of SOPs, templates, process mapping, and similar documentation.
- Define and scope performance analytics services to support client strategy and goals, estimate required level of effort, and manage work assignments to maintain a healthy level of team utilization.
- Ensure cross-functional understanding of performance analytics services and value.
- Lead the fulfillment of current/upcoming analytics obligations.
- Work with key stakeholders (department leads, new business team, client account management team) to catalog, package, and sell performance analytics services.
- Work collaboratively with other value delivery team members to support proper performance analytics setup and data collection for measurement of paid media, SEO, CRO, and other relevant integrated marketing strategies.
- Serve as a thought leader and subject matter expert on the performance analytics landscape, evolving strategies and approaches, and best practices, both internally and with clients.
As an inidual contributor:
- Develop client performance analytics strategies providing creative, effective solutions to challenging business problems.
- Deploy and maintain tracking code for clients using Google Tag Manager (or other tag management solutions) and troubleshoot and resolve issues as needed.
- Audit both strategic and technical aspects of client analytics setups; document and implement analytics solutions & updates.
- Perform thorough QA on work completed by other team members.
- Build reports, dashboards, and data visualizations, applying best practices and leveraging a diverse range of platforms.
- Draw actionable insights from performance data and develop/present strategic analytics recommendations to clients.
Required skills and experience:
- Strong organizational, prioritization, and collaboration skills and the ability to successfully navigate ambiguity.
- Ability to influence and inspire trust among team members and internal and external stakeholders.
- High degree of comfort in client-facing situations, including participation in the sales process as a subject matter expert.
- 5+ years of experience in an analytics role in an agency environment, preferably with at least 2 years of direct team leadership experience.
- Advanced knowledge and hands-on experience (certification preferred) with each of the following platforms:
- Web analytics platforms, including GA3, GA4, Mixpanel, Adobe Analytics, Webtrends, and Funnel Metrics (experience with GA4 implementation required)
- Google Tag Manager (and other tag management solutions)
- Google Data Studio/Looker Studio, Tableau, Microsoft Power BI
- Google Optimize
- Segment
- Experience with advanced analytics, such as:
- Custom Segmentation
- Multi-channel funnels
- Multi-variate testing for creative, incrementality, conversion optimization, and other performance areas
- Deriving insights from large data sets using BigQuery
- Data ingestion and manipulation working with large, complex datasets and dimensional and relational databases
- Deployment of predictive modeling, machine learning, and other advanced technologies to drive actionable marketing insights (preferred but not required)
- Proficiency in applicable development languages:
- Advanced SQL/Python
- JavaScript (including GA.js, analytics.js, jQuery)
- VBA / Excel Macros
The culture and perks of being a Tallwaver:
- We embrace a maverick attitude. We're willing to challenge convention and think beyond the ask.
- We are committed to relentless pursuit. We're curious with a focus on personal growth.
- We value selfless teamwork, demonstrating strong leadership qualities with a knack for collaboration.
- We are thoughtfully rigorous, have a strong work ethic, and have strong attention to detail.
- An environment that fosters empathy and compassion for each other and our clients.
- A diverse (and competitive, and hilarious, and sometimes a little crazy) group of coworkers. You know, the kind you actually want to grab a drink with after work.
- Career growth and opportunity. We grow leaders here.
- We recognize the contributions of individuals and teams throughout Tallwave and we stop to celebrate through happy hours, on-the-fly high fives, core-values-based awards during our All Hands Meetings, and Tallwave anniversary awards, to name a few.
- Catered “Lunch & Discuss” sessions, and formal training sessions focused on continuous learning and growth as a team.
- We offer comprehensive Medical, Dental, Vision, Life, and 401(k) plans.
- Concierge-booked Tallwave Work Anniversary experiences.
- We provide employer-paid short-term disability and long-term disability plans and life insurance.
- We're more concerned about how we work than where we work. We offer remote work and office hours flexibility as needed.
- We trust our team -- we have a generous PTO policy. Seriously, did you read about it yet? It's kind of amazing!
- We are surrounded by badass talent and clients -- our work has been featured in TechCrunch, Forbes, Wired, Product Hunt, and more.
- We are featured on the 2020 INC Best Workplaces List, the 2020 Phoenix Business Journal Best Places to Work List, and the 2021 AZ Central Top Places to Work List.
Why you'll love working with us:
Free to be you, each and every day! We celebrate diversity and empower you to bring your authentic self to work, always. We know that diversity, equity, and inclusion is not about checking a box. It's more than words; it is a pledge and a promise that Tallwave is passionate about pursuing. We recognize having an all-encompassing culture will help us learn and grow collectively, and continue to foster our mission of making everyone we work with wildly successful.
Working conditions:
In the office, employees operate in a professional office environment and routinely use standard office equipment like computers, telephones, and teleconferencing equipment. The office is an open floor plan so employees can communicate with everyone around -- no cubicles at Tallwave.
Physical requirements:
This job may require sitting for long periods of time, and minimal (if any) lifting. Some travel may be required.
Position Applicants, please note:
At this time, this position is open to U.S. citizens only who live in the U.S. We are also unable to provide visa sponsorship of any kind, so please do not apply if you will require any kind of visa sponsorship, or are not a U.S. citizen living in the 48 contiguous states, Alaska, or Hawai'i.
Tallwave is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
#LI-Remote

About us:
Fieldguide is establishing a new state of trust for global commerce and capital markets through automating and streamlining the work of assurance and audit practitioners specifically within cybersecurity, privacy, and ESG (Environmental, Social, Governance). Put simply, we build software for the people who enable trust between businesses.
We’re based in San Francisco, CA, but built as a remote-first company that enables you to do your best work from anywhere. We're backed by top investors including 8VC, Floodgate, Y Combinator, DNX Ventures, Global Founders Capital, Justin Kan, Eric Ries, and more.
We value diversity — in backgrounds and in experiences. We need people from all backgrounds and walks of life to help build the future of audit and advisory. Fieldguide’s team is inclusive, driven, humble and supportive. We are deliberate and self-reflective about the kind of team and culture that we are building, seeking teammates that are not only strong in their own aptitudes but care deeply about supporting each other's growth.
As an early stage start-up employee, you’ll have the opportunity to build out the future of business trust. We make audit practitioners’ lives easier by eliminating up to 50% of their work and giving them better work-life balance. If you share our values and enthusiasm for building a great culture and product, you will find a home at Fieldguide.
Responsibilities
- Analyze large amounts of information to discover trends, patterns, and insights
- Leverage data insights to enable strategic decision-making
- Undertake preprocessing of structured and unstructured data
- Present information using data visualization techniques
- Build and improve data analytics tools stack (BI/Reporting, ETL, Data Warehousing)
- Design and prototype new customer-facing analytics features for Fieldguide’s products
- Collaborate cross-functionally with engineering, product, customer success, go-to-market, leadership, and external partners/customers
Skills
- Proven experience as a Data Analyst or Data Scientist
- Deep knowledge of SQL; familiarity with Python, R or Java is an asset
- Experience using data visualization and reporting tools (e.g. Tableau, PowerBI)
- Technical expertise regarding data models, database design development, data mining and segmentation techniques
- Analytical mind and business acumen
- Problem-solving aptitude
- Excellent communication and presentation skills
- BSc/BA in Computer Science, Engineering, Math/Statistics, Business, or relevant field; graduate degree in quantitative field is preferred
More about Fieldguide:
Fieldguide is a values-based company. Our values are:
- Fearless - Inspire & break down seemingly impossible walls.
- Fast - Launch fast with excellence, iterate to perfection.
- Lovable - Deliver happiness & 11 star experiences.
- Owners - Execute & run the business with ownership.
- Win-win - Create mutual value & earn trust for life.
- Inclusive - Scale the best ideas with inclusive teams.
Some of our benefits include:
- Competitive compensation packages with meaningful ownership
- Unlimited PTO
- Wellness benefits, including a bundle of free therapy sessions
- 401k
- Technology & Work from Home reimbursement
- Flexible work schedules

As part of our international IT team, you will work with our ePoint platform - a web application developed by E&C for keeping track of global energy portfolios.
As a Power BI Developer, you will be the person who further develops interactive ways of presenting energy data – simply reinventing the way energy data is seen to make it better & smarter than ever before.
In more detail, you will:
- work in agile teams to develop visualization parts of new user stories for our ePoint platform
- work together with our database and data warehouse developers to make all necessary data connections and set up data marts so that a custom report gets filled with basic data
- create all necessary output tables and graphs in our ePoint platform
- develop tailor-made dashboards and graphs on energy data portfolios
- create and manage automations of data presentation and its modern visualizations
- give guidance to the Data Reporting team to make sure reports are updated and published and/or sent out on time
- administer Power BI tools
What do we offer you?
- flexible hybrid/remote work mode possible after initial onboarding
- balance days, resulting in 46 days off per year
- challenging and interesting environment in the continuously evolving energy sector
- international work environment with more than 25 nationalities
- flat company structure offering autonomy and empowerment from day 1
- regular discussions on your role(s) and competencies with your team leader
- a competitive salary with a 3-year progression plan
To make this happen, it would be great if you:
- are a Power BI wizard (PL-300 certificate would be a big plus)
- have comprehensive knowledge of data design, data modelling, data management, and data visualization
- gained experience in creating data-rich dashboards, writing DAX expressions, and implementing row-level security in Power BI
- know how to convert data into meaningful stories
- have interacted closely with the business before


location: remote (US)
Enrollment Specialist – Remote
REMOTE
Nashville, Tennessee, United States
Sales
Description
CareHarmony Overview
CareHarmony is a venture-backed start-up; formed to help physicians thrive in the new era of value-based care by combining the latest advancements in population health technology with 24/7 care coordination services.
We are on a mission to heal healthcare and set the benchmark for managing chronically ill patients.
An Enrollment Specialist is a member of the Patient Engagement Specialist (PES) team, telephonically educating and enrolling patients in a Medicare-sponsored care coordination program designed to help the patient better manage their chronic illnesses (diabetes, high blood pressure, COPD, etc.).
Position Details/ Responsibilities:
You will work from an assigned patient pool, making outbound calls and educating patients on the advantages of our care program.
You will be responsible for establishing rapport with potential patients, overcoming objections to participation, and tailoring your messaging to patients based upon their demographics, unique health concerns, and insurance.
Requirements
- Excellent communicator with the ability to explain the program offered in an informational, influential, concise, and personable manner
- Proven ability to work within a team dynamic and be a leader
- Warm and professional etiquette when speaking on the phone with patients
- Possess high-quality data entry skills with the ability to multi-task
- Natural ability to express empathy with a patient-focused mindset and engagement
- Ability to be creative in delivering education to patients, while tailoring your approach to showcase the benefits of the program
- Proven adaptability in a high-volume sales space, ability to meet deadlines, and metrics
- Comfortable with working within different software platforms (Microsoft Office: Outlook, Excel, Word, Skype)
- Will be able to consistently deliver high call volume, spending in excess of 80% of your time on the phone.
- Availability during business hours (8 am – 4:30 pm CST)
- Sales acumen is preferred, but not required
- Associate degree or higher
- US Based
Benefits
Competitive Benefits Package
Full Time, 100% Remote
Advancement Opportunities

location: remote (US)
Data Entry Specialist
Summary
Title: Data Entry Specialist
ID: 1434
Location: Remote
Description
Would you like a job that does not involve a hair-net or a cash register and where your almost-compulsive attention to detail is a huge plus? If these questions have piqued your interest, we may have the perfect job for you!
Nova 401(k) Associates is looking to fill several Data Specialist (Data Entry Clerk) positions (part-time or full-time options available). Although industry experience is welcome, it is not necessary, and paid training will be provided. This position can be done in our Houston office or at your home. This is a great position for stay-at-home parents looking for a job to pass the time while their children are in school, college students, entry-level individuals, individuals returning to the workforce, individuals looking to get out of restaurant and/or retail employment, and many others! Nova 401(k) Associates is a rapidly growing national retirement administration firm headquartered in Houston, Texas. Nova provides consulting and pension administration services for all types of qualified retirement plans.
Job Responsibilities:
- Provide data entry assistance for the Distributions Department
- Complete daily goals of assigned tasks
- Execute assigned distribution pre-work
- Submit requests to recordkeepers in a timely manner
- Upload and submit forms to our data software
- Perform other duties as required
Qualifications:
- Minimum of high school diploma or equivalent; Some college or professional training is preferred, but not required
- Prior work experience preferred, but not required
- Ability to maintain confidential information
- 45+ WPM typing speed preferred
- Superior organizational skills
- Proficiency with MS Excel
- Clean background check
- Must maintain a professional appearance when working in the office and/or participating in virtual team calls
- For individuals working at home, an appropriate home workspace is required
- Ability to work a regular, fixed schedule with a significant part of the schedule being between the hours of 8:00am and 3:00pm CST, Monday through Friday; Work schedule is subject to Team Leader approval
Other things to know:
- Part-time work needs to be between 20 and 29 hours per week.
- For individuals working from home, we will provide a computer setup. The computer provided may not be used for personal use and must be promptly returned upon termination of employment. (Reimbursement of any associated shipping cost will be provided). A wired (not wireless), fast and stable internet connection of sufficient speed will be required. DSL and satellite internet are not acceptable.
- Time off (unpaid or paid dependent upon part-time or full-time status) is limited to two scheduled consecutive work days for the first three months and during peak times.
- Additional work hours may be available on a voluntary basis during peak times.
- Benefits depend on whether the position is part-time or full-time. For full-time employees, we do offer a broad array of benefits.
- We are pleased to provide advancement opportunities for all of our employees which may include promotions, rotations to other departments or moving from part-time to full-time status. Advancement from this position will be easiest for candidates who have a four-year college degree, who are bilingual, and/or who have excellent communication and customer service skills.
Are you excited about leveraging data and complex algorithms to improve business decisions and drive business value, helping to build the No. 1 digital lender in Nigeria and other emerging markets? Then, we’re looking for you!
Your role is Lead Data Scientist in the Data Science/Risk team working from offices in Paris, Riga, Lagos, and Bangalore as well as remotely across the globe.
Your mission is to develop data science-driven algorithms and applications to improve decisions in business processes like risk and growth, offering the best-tailored credit services to as many clients as possible.
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases and external data sources to drive optimization and improvement of risk strategies, product development, marketing techniques, and other business decisions.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, and other business outcomes.
- Coordinate with different functional teams to make the best use of developed data science applications.
- Develop processes and tools to monitor and analyze model performance and data quality.
- Apply advanced statistical and data mining techniques in order to derive patterns from the data.
- Own data science projects end-to-end and proactively drive improvements in both data science and business processes.
Requirements
- Strong background in Mathematics / Statistics / Econometrics / Computer science or related field.
- 5+ years of work experience in analytics, data mining, and predictive data modeling, preferably in the fintech domain.
- Being best friends with Python and SQL.
- Hands-on experience in handling large volumes of tabular data.
- Strong analytical skills: ability to make sense of a variety of data and its relation/applicability to a specific business problem.
- Feeling confident working with key Machine learning algorithms (GBM, XG-Boost, Random Forest, Logistic regression).
- Experience in building and deploying models around credit risk, fraud, collection, and growth.
- Strong focus on business impact and experience driving it end-to-end using data science applications.
- Strong communication skills.
- Being passionate about all things data.
Our tool stack
- Programming language: Python
- Production: Python API deployed on Amazon EKS (Docker, Kubernetes, Flask)
- ML: Scikit-Learn, LightGBM, XGBoost, shap
- ETL: Python, Apache Airflow
- Cloud: AWS, GCP
- Database: MySQL
- DWH: BigQuery
- BI: Tableau, Metabase, dbt
Benefits
- Paid Time Off (25 days Vacation, Sick & Public Holidays) to all B2B contractors and employment staff.
- Family Leave (Maternity, Paternity)
- Training & Development budget
- Paid company business trips (not mandatory)
- Contract: permanent
- Location: any, within 3 hours difference from CET.
- Remote work: any combination of remote / office work is acceptable.
- Screening call with Talent Manager.
- Test assignment and technical interview with data science team leads.
- Interview with the team and key stakeholders.

Founded in 2015 and rapidly expanding, Fresh Gravity (www.freshgravity.com) is a business and technology consulting company at the cutting-edge of digital transformation. We drive digital success for our clients by helping them adopt transformative technologies that make them nimble, adaptive and responsive to their rapidly-changing business needs. Our unparalleled digital transformation expertise combines business strategy prowess with digital technologies know-how. Our expertise includes Data Management, Artificial Intelligence, Data Science & Analytics, and API Management & Integration.
In a short time, we have crafted an exceptional team who have delivered impactful projects for some of the largest corporations in the world. We are on a mission to solve the most complex business problems for our clients using the most exciting new technologies. And we are looking for top talent to join us in our quest.
Fresh Gravity’s team members are authorities in their field, but know how to have fun, too. We’re building an inspiring, open organization you’ll take pride in. We challenge ourselves to grow – every day. We create value for our clients and partners – every day. We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations.
We are thoughtful. We are engaged. We are relentless. We are Fresh Gravity.
JOB DESCRIPTION
We are looking for a Reltio Architect/Senior Developer, to drive a Reltio Master Data Management (MDM) implementation. The candidate is expected to be able to interact with business subject matter experts, technical IT support personnel representing cross functional business applications and manage internal team efforts in troubleshooting and remediating production issues. The candidate will be responsible for designing and building the Reltio MDM platform.
In addition, the candidate will be involved in other internal company initiatives including pre-sales solutioning, Reltio based solution asset development and other data management efforts.
Examples of day to day job functions include:
- Serve as a solution architect in identifying improvement opportunities and designing innovative solutions to improve end-to-end mastering of data
- Drive the end-to-end solution architecture of MDM solutions including data model definition and refinement, data quality assessment and remediation design, data migration strategy, data stewardship process and system interaction design
- Design and coordinate implementation of automated system monitoring and health check solutions
- Support the translation of business requirements into techno-functional requirements
- Coordinate with remote/offshore team members on design and development efforts on break-fixes and enhancements
- Support development of performance reports on managed services KPI SLAs
- Manage client communications and coordinate troubleshooting sessions with cross functional/application teams
- Prioritize and plan support service efforts to address incidents, service requests, change requests and enhancements
- Manage and support test and production environment code/build releases
- Support business operations by providing explanations on items that might need clarifications as well as provide general business operations support to ensure customer issues are addressed when needed.
- Participate in internal Fresh Gravity activities including project implementation estimations, pre-sales efforts, responding to RFPs, designing and building industry solutions, mentoring and guiding junior technical developers and competency development within the Data Management capability.
Requirements
Technical
- 5+ years’ experience working on MDM & data management related projects, including at least 2 years of Reltio implementation/support experience
- Must have good understanding of Master Data Management (MDM) architectures and business processes
- Must have hands-on experience with data modeling, data mapping, data validation, match and merge rules, survivorship rules, UI/dashboard design and configuration and Reltio REST APIs
- Experience with data profiling and leveraging the results of profiling in the design of the MDM solution
- Experience in operationalizing MDM solutions, job scheduling, workflow management, alerts and notifications
- Familiarity with Reference Data Management, data transformation, data cleansing and standardization
- Demonstrated experience and understanding of data integration, data quality, data architecture and Master Data Management project life cycle phases, best practices and processes
- Understanding of various source system applications and data nuances in healthcare and life sciences solutions
- Experience integrating Reltio with SnapLogic is a big plus
- Experience providing operations/production support highly desirable
Other
- Experience working in a fast-paced environment with competing priorities, while effectively managing time and workload expectations
- Experience working with Reltio Product Support/Professional Services teams and Reltio documentation to resolve complex client requirements and product issues/limitations
- Excellent communication skills and capability to engage with clients and stakeholders to drive MDM design-decisions, project collaboration, inter-team dependency management, etc.
- Ability to define and develop estimates, collaborate on project schedule, and contribute to project plan through documentation of tasks and requirements
- Ideally, based in the Dallas, TX area or able to travel to the area (or other areas) on an occasional basis.
- Preference is for full time but open to a contract arrangement
*Please indicate salary expectation when submitting your application - otherwise the application will not be reviewed*
Benefits
In addition to a competitive package, we promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. In keeping with Fresh Gravity’s challenger ethos, we have developed the 5Dimensions (5D) benefits program. This program recognizes the multiple dimensions within each of us and seeks to provide opportunities for deep development across these dimensions: Enrich Myself; Enhance My Client; Build My Company; Nurture My Family; and Better Humanity.


data | Europe
SlashData is the leading research company in the developer economy: We help the world understand developers and developers understand the world. We survey 40,000+ developers annually - across mobile, IoT, desktop, cloud, AR/VR, web, games, data science, and machine learning - to help clients such as Microsoft, Facebook, Google and Amazon understand who developers are, what tools they love or hate and where they are going next.
We’re now looking for a full-time Data Analyst to help drive insights out of a wealth of data points relating to developer activity. You’ll be based anywhere in Europe / Greece, working remotely.
Requirements
Who we are looking for
We are looking for a Data Analyst with at least 3 years of experience in data analysis to join our fast-growing company with the ability to undertake many high-profile data analysis projects.
You will be naturally curious and inquisitive and demonstrate a real passion for data and analytics, have an eye for detail and the aptitude to learn new technologies and skills.
What you'll be doing
- Analysing complex survey data using Python, pandas, numpy, and our custom-built Python data analysis library.
- Handling clients’ business questions and data requests by preparing data tables and charts that serve their research needs.
- Building interactive data dashboards on our home-grown data visualisation platform.
- Supporting senior research consultants and other internal teams (sales, developer outreach etc.) with data requests as needed.
- Shaping the future of our in-house data analysis tools by actively recommending and/or contributing to improvements and new features.
The skills we are looking for
- Fearless in data analysis and number crunching.
- Background in statistics, mathematics, engineering or other quantitative field.
- Strong Python skills, with a focus on pandas and numpy.
- Advanced Excel skills (not our primary data analysis tool, but we use it occasionally).
- At least a basic grasp of version control and git.
- Ability to work on multiple tasks simultaneously and manage your time effectively.
- Problem solving and analytical skills.
- Excellent organisation skills and attention to detail.
- Minimum of 3 years experience as a data analyst.
- Excellent English language skills (spoken and written).
Bonus points for
- Advanced version control and git skills.
- Past experience of working with JSON files.
- Past experience of working with complex survey data.
- Data modelling.
- A passion for technology, and an understanding of how it’s impacting people’s lives.
Key success metrics
You will be successful in the role if in the first 6 months you have:
- Become a dependable team member.
- Fully familiarised yourself with our survey datasets and our internal tools for data analysis and visualisation.
- Delivered at least 12 sets of data tables and/or graphs in response to client requests.
- Built at least 6 interactive data dashboards.
Benefits
What we offer
- Opportunity to make a difference as part of the leading research company in the developer economy
- Bonus based on performance and company profitability
- Flexible working environment
- Annual training budget to develop your skills and career
- Monthly book allowance from Amazon, on any book you like
- Spotify Premium subscription or Netflix
- Annual co-working budget
- Initial office set up budget to ensure you have the right equipment
- Attendance at our bi-annual team event
- A fun and exciting work culture with regular opportunities to connect
- Never work on your birthday
- No dress code
- Part of an entrepreneurial company that's raising the bar, and calling the trends of the developer economy
Would you like to join our team? Apply today with your CV and cover letter to support your application.

At SparkMeter our mission is to electrify everything and everyone. We help utilities and governments make the transition to electrification while providing affordable, reliable, clean power. We don’t care where in the world you are or how big or small you are. We are particularly interested in solving these problems in underserved markets that are served by outdated incumbents.
SparkMeter is searching for a Senior Data Engineer to lead the design and development of analytics and digital modeling solutions for the utility industry. As a Senior Data Engineer you will be responsible for developing data integrations and analytics capabilities within the SparkMeter platform and product. This role is a key member of our engineering team and will report directly to the VP of Engineering.
What you will be doing:
- Design and implement data ingestion capabilities for system integration
- Develop analytical data models
- Develop data models for digital modeling of physical utility systems
- Develop ETL code for populating data models
- Devise end to end strategies for embedding analytics capabilities into SparkMeter products from ingestion through visualization
- Deploy and integrate with data visualization solutions
- Develop machine learning models for predictive analytics
- Work closely with product managers to understand customer requirements
Requirements
Requirements & Skills:
- Experience developing and supporting a commercial analytics product
- Experience with coding in Python, R, Pig script, and SQL
- Experience with performing data analysis, data ingestion and data integration
- Experience developing ETL (Extraction, Transformation & Loading) and architecting data systems
- Experience with schema design and data modeling
- Experience in writing, analyzing, optimizing and debugging SQL queries
- Basic understanding of various Big Data technologies
- Solid communication and collaboration skills
- Passionate and self-motivated about technologies in the Big Data area
- Bachelor's or Master's degree in Computer Science or related fields or equivalent practical experience
Nice to have:
- Utility industry experience
- Demonstrated data science or quantitative analysis experience
- Demonstrated experience handling terabyte size datasets, applying statistics and machine learning techniques and algorithms, and using visualization tools to present data.
- Understanding of deep learning, or distributed computing (Hive/Hadoop).
- Experience developing machine learning models or predictive analytics
- Experience building data integration and analytics solutions using the AWS platform and services
Where you will work:
- This is a 100% remote position; however, we are headquartered in Washington D.C. and have an amazing downtown office that you are welcome to use. The team will also meet there periodically (likely quarterly), so some limited travel will be required.
- You will report to the VP of Engineering who is in Denver, Colorado USA
Benefits
- 401k with match
- Annual bonus
- Equity
- Health insurance, with 100% of the premium covered for you, and 30% for your dependents
- Paid vacation
- 13 paid holidays

ABOUT US:

Built around our mission, we started Athletic Greens more than a decade ago as a way to bring comprehensive and convenient daily nutrition to just about everybody. Originators of the Essentialist Nutrition movement, our philosophy is to focus on a very small number of products based on what the latest science indicates is essential to human health.
AG1 is made from the highest quality ingredients, in accordance with the strictest standards and obsessively improved based on the latest science. 75 vitamins, minerals, and whole-food sourced nutrients in one convenient daily serving. Optimized for athletes, life-letes, and everyone in between. Vegan, Paleo, and Keto-friendly. One scoop or travel packet, 8 ounces of water. Every day. That's it.
Our mission is to empower people to take ownership of their health.
THE ROLE:
We are looking for a data-driven, detail-oriented Customer Happiness Analyst, Reporting who will report to our Customer Happiness Global Workforce Manager. With our customer at the center of everything we do, this critical role will ensure that data is being used to guide decision-making, process improvements, KPI tracking, and proper reporting. This person will provide key insights, enabling senior leaders to make sound decisions and provide strategic direction to the organization. With partnership and support from AG’s Operations, Tech & Customer Happiness Teams, our empowered Customer Happiness Analyst will serve as AG’s primary contact point for reporting & data in the Global Customer Happiness team.
WHAT YOU'LL DO:
- Design and implement methods to gather and manage Customer Happiness data.
- Develop and monitor all relevant dashboards and reporting to identify key trends.
- Identify issues and make data-driven recommendations that will drive necessary change.
- Report KPIs weekly and update dashboards.
- Ensure data integrity by consistently monitoring, scrubbing and refining Customer Happiness data.
- Identify, investigate and resolve data issues and discrepancies, ensuring that the proper measures are in place to limit inaccuracies.
- Lead projects for the Customer Happiness team related to the gathering and distribution of data; provide meaningful recommendations and insights to guide future projects and initiatives.
WHAT WE'RE LOOKING FOR:
- BS in Mathematics, Economics, Computer Science, Information Management or Statistics.
- 3+ years of analytic and reporting experience in a contact center setting preferred.
- Strong understanding and applicable experience working with data models, database design development, data mining and segmentation techniques.
- Strong knowledge of and experience with reporting packages (Business Objects etc), databases (SQL etc), and programming (XML, JavaScript, or ETL frameworks).
- Project management experience preferred.
- Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.).
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Adept at queries, report writing and presenting findings.
- Strong critical thinking skills. You can identify opportunities, anticipate challenges and gaps, and contingency plan accordingly.
- Strong interpersonal, written communication, and organizational skills.
- Ability to work independently and take ownership for outcomes while prioritizing and juggling multiple projects.
- A full life outside of work with personal passions and hobbies!
WHAT'S IN IT FOR YOU?
- A 100% remote working environment, which has been implemented from day one.
- An immensely strong company culture, reinforced through the hiring process, that ensures values alignment and a highly collaborative team.
- 90% premium coverage for top medical, dental and vision plans.
- Employer paid short-term disability and life insurance.
- A mission-driven approach to each and everything that we do, with an overall goal to significantly improve our customer's health and wellness.
- Paid maternity and paternity leave to allow you to spend time with your new family.
- A monthly fitness class credit to support you on your health and fitness journey.
- Annual company-wide retreats and quarterly department-wide meet-ups (COVID permitting).
- A 401k plan with company-matching because life is an opportunity and we care about your future.
- Paid holidays and an unlimited vacation policy to ensure you have a healthy work-life balance.
- The opportunity to work with passionate, high-growth, business-minded colleagues.
- A brand new company laptop and a generous home-office budget so you can work your best.
- Access to Athletic Greens product and swag items.
Athletic Greens is an Equal Opportunity Employer. We are committed to inspiring fulfilling lives, starting with a focus on health, and we believe this starts in the workplace. We do this by providing employees with a safe and welcoming work environment free of discrimination and harassment. We strive to create a diverse & inclusive environment where all team members can thrive, feel a sense of belonging, and make a difference in the world together. We do not discriminate based on age, race, color, religion, sex, sexual orientation, gender identity, national origin, pregnancy, veteran status, disability, or any other status protected by law. We are an equal opportunity employer.

ImagineX is a rapidly growing technology consulting firm with an obsession for consistently delivering excellence to clients while always remaining passionate about what we do. At IX, we understand that happy consultants make happy clients, so nothing is more important to us than a strong culture, solid leadership, collaboration, open communication, and supporting our people each and every day. No big egos allowed! This role is 100% remote, and if you're based in Atlanta, you'll have the option to attend office meetings and events Tuesdays & Thursdays (office is located in Sandy Springs).
We're looking for a Senior Database Developer who enjoys working directly with clients solving complex problems. As a Senior Consultant, you will work closely with our clients performing database design and architecture responsibilities. You'll create database tables, stored procedures and functions, analyze performance, and optimize stored procedures. You'll occasionally perform DBA duties and handle database migrations as well.
Other Responsibilities Include:
- API development using Microsoft .Net as a secondary duty
- Proposing solutions using best practices
- Debugging issues and fixing bugs
- Estimating tasks and preparing plans for development
- Unit testing the code
- Mentoring fellow team members is a plus
Required Skills:
- 7+ years of database development experience, including building and integrating data at client sites, with expertise in SQL Server 2014 or higher, complex T-SQL stored procedures, views, and user-defined functions
- Consulting experience with comfort communicating with all levels of stakeholders including executives, application developers, business users, and customers
- Ability to drive requirements-gathering sessions with business and technical staff to distill technical requirements from business requests
- Design and documentation experience, allowing complex concepts to be communicated clearly


data | 🇺🇸 USA only
Company Overview:
Beam (fka Edquity) is an anti-poverty technology company that helps institutions and governments administer various cash assistance programs in a manner that is faster, more equitable, and more effective. Beam works with over 40 postsecondary institutions and multiple local governments and has administered over $110M in funding in an average of 25 hours from application.
Beam is a Series A stage, venture-backed company and has received support from many of the leading impact and postsecondary success investors, and has also received non-dilutive support from foundations like the Bill and Melinda Gates Foundation.
Job Overview:
Beam is seeking an experienced Data Analyst II who is passionate about enhancing financial stability and social mobility for Americans.
The data team at Beam is small but mighty. We are seeking an experienced data analyst who wants to add value across multiple teams, including but not limited to Operations, Product, Design, Reporting products, and Data Infrastructure. The ideal candidate for this role has worked with similar organizations in a data function and has technical expertise applying SQL and Python and/or R to analytics and data problems. They should have experience with building data products (e.g. dashboards) from start to finish, including requirements gathering, pipeline design, prototyping and automation. At Beam, we value stewardship and solutions, so this candidate should have a strong sense of ownership and independence.
We are building a robust data analytics suite on an AWS toolset. Tools we use include RDS, S3, Glue, and Redshift. We are building our data viz out in Glean and we work with Heap for product analytics. Familiarity with these or similar tools is preferred.
Ideal Start Date: by January 2023
What you might do:
- Serve as a primary point of contact for reporting requests from our partners
- Design and develop scalable data models where highly complex data relationships exist
- Architect and maintain highly automated and scalable ETL/ELT data pipelines
- Provision, optimize, and maintain data feeds to external systems
- Create visualization and reporting as needed
- Create well-written technical documentation
- Design new functionality in our existing components for the team to leverage
- Evangelize best practices around data and team processes
This posting lists a lot of qualifications, but we recognize that the right candidate may not match 100% of the job qualifications. Importantly, we want you to grow with the company, so we are open to a candidate who is willing to grow in core competencies. Therefore, we encourage you to apply if you can demonstrate many, but not all of these skills and competencies and are willing and excited to grow!
Core Skills, Behaviors, and Experiences:
- Experience writing SQL and Python, solving complex data problems, good data modeling, as well as building automated data pipelines
- Expert at writing and optimizing SQL scripts
- Comfortable reading, editing, and potentially contributing to Python or PySpark scripts.
- Be a quick learner, self-starting, with the ability to maintain and build within a horizontally and vertically scaling data system
- Experience with cloud data environments
- Experience building data visualizations in dashboarding software
- Experience working in government technology or the social impact space is not required but is a plus
- Experience with statistical analysis, including A/B testing, regression, and contingency analysis methods is not required but is a plus
Compensation and Benefits:
The salary range for this position has been benchmarked in relation to the scope of the role, market rate, company stage, and internal equity. The salary for this role will be between $90,000 - $110,000. Where a candidate falls within the band is determined by skillset, experience level, and geographic location. In addition to base salary, this role will come with a total compensation package that includes equity shares and competitive benefits. Some of our benefits include:
- Fully paid health insurance (Medical/Vision/Dental)
- Unlimited PTO, Sick and Mental Health Benefits
- 11 paid company holidays
- 401k with a 4% match
- Generous parental leave
- Annual Professional Development Stipend
- One-time Home Office Setup Stipend
- Equity in Beam
- Many more!
Beam (fka Edquity) provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

Education & Services Data Entry Clerk
(REMOTE – Work from Home), United States
Mission Operations & Research | Temporary
How many people can answer the question, “What do you do for a living?” with the answer, “I help find cures for cancer.” At LLS, employees take our mission seriously. Whether you work in one of our chapters, are an accountant at the national office or a specialist in our Information Resource Center, you work each day on making our mission a reality: Cure leukemia, lymphoma, Hodgkin’s disease and myeloma, and improve the quality of life of patients and their families.
Join us and give new meaning to the word, “job.”
The Data Entry Clerk plays a supportive role in data capture within Education & Services. This individual will perform ongoing data entry, data quality review and cleanup in a Salesforce Lightning environment. He or she will work in partnership with the Director, Data Quality Improvement & Training to ensure timely and quality data entry.
***This is a TEMPORARY position staffed by TargetCW. The assignment will not exceed 6-12 months. 20-30 hours per week remotely from home.***
While employees may be permitted to work remotely, travel to the assigned office, HUB or Satellite Offices may be required as determined by the employee’s manager and the employee’s Strategic Talent Partner.
All LLS employees are required to provide proof of vaccination for COVID-19 before beginning work at LLS unless a valid medical, religious, or other exemption is approved consistent with applicable law.
Duties and Responsibilities:
- Enter data for new patient/caregiver/healthcare professional contacts and update existing contacts within the Salesforce database.
- Ensure data entry is done in a timely manner to allow for LLS follow-up with financial assistance program recipients.
- Review data entry in detailed fashion for accuracy and evaluate source data for any inconsistencies.
- Additional Salesforce data entry as needed.
- Perform other related duties as assigned.
Position Requirements:
- Superior attention to detail and dedication to accuracy.
- Excellent ability to work independently as well as collaboratively.
- Strong computer skills and experience with Salesforce and Microsoft Excel required.
- Home office setup which includes laptop/desktop and phone.
- Quiet working area away from distractions.
Education & Experience Requirements:
- Experience with data platforms, entering, and reviewing data.
- 1-3 years’ relevant experience, preferably in a non-profit and/or healthcare delivery setting.
LLS is an equal employment opportunity employer committed to diversity and inclusion in its workplace. All qualified applicants will receive consideration for employment without regard to sex, gender, gender identity and expression (including transgender status), sexual orientation, pregnancy, age, race, color, creed, national origin/ancestry, citizenship, religion, genetic predisposition or information, physical or veteran status, or any other characteristic protected by law. LLS seeks such skilled and qualified individuals to share our mission and where they will join a cohort of others who have chosen to call LLS home. LLS complies with all applicable laws which prohibit discrimination and harassment in the workplace, and is committed to maintaining a diverse and inclusive environment supportive of all of our employees and the communities we serve. **All LLS employees are required to provide proof of vaccination for COVID-19 before beginning work at LLS unless a valid medical, religious, or other exemption is approved consistent with applicable law.**
Our client is a startup from Israel working in the financial sector. The SaaS product developed by our client protects the financial transactions of end users from fraud using AI/ML technologies.
We are looking for a motivated team member to join the distributed engineering team to develop the next version of the product.
As a Data Scientist, you will be in charge of constructing the company's Machine Learning pipelines and models. You will research the performance of the models and take part in devising new features to improve that performance and combat new and emerging fraud patterns.
Responsibilities:
- Improving the performance of the company's Machine Learning models
- Adapting the company's models to the needs of new clients
- Researching the performance of the company's Machine Learning products and suggesting new features
Skills:
- A brilliant and experienced data scientist
- 3 years of experience as a data scientist and software engineer
- Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field
- Significant experience in building machine learning models and applying statistical techniques
- Strong proficiency in working with Python
- Experience working with SQL and NoSQL databases
- Git experience
- Loves the startup spirit and would like to join the ride
- Competitive market salary
- Flexible working hours
- Paid sick leaves and vacations
- Team of professionals who know how to build world-class products
- Wide range of excellent opportunities for professional and personal growth


location: remote (Somerville, Massachusetts, US)
Medical Record Reviewer
Somerville, Massachusetts (remote)
$17.99 – $18 per hour
temporary
high school
shift: First
work hours: 8 AM – 4:30 PM
category: office and administrative support occupations
reference AB_4126442
job details
Mass General Brigham is a Boston-based non-profit hospital and physicians network that includes Brigham and Women’s Hospital and Massachusetts General Hospital, two of the nation’s most prestigious teaching institutions. As a world-recognized leader in research, we are home to one of the largest hospital system-based research enterprises in the U.S.
The primary responsibility of this position will be managing the HEDIS medical records review project, including coordination of medical records data collection and chart reviewing. This position will facilitate appropriate medical records data collection for EQRO (External Quality Review Organization) auditing and all other medical records review, such as medical records document standards review, and will also facilitate collection of race/ethnicity and language data from PCP practice sites.
100% REMOTE – MUST be able to work Mon-Fri from 8am-4:30pm EST
Responsibilities
- Manage relationships with provider sites to collect HEDIS, EQRO, and NCQA data for various projects at AllWays Health Partners
- Accurately and efficiently conduct electronic record reviews
- Successfully complete required medical record reviewer training and Inter-Rater Reliability testing
- Communicate effectively and professionally with outside provider offices, clinics, and hospitals
- Participate in various medical record abstraction projects aimed at measuring outcomes in support of quality improvement projects
Requirements:
- High School Diploma
- HEDIS experience, Medical Review experience, Excel + Word.
- Ability to use HEDIS medical records review tool and to manipulate data in Excel and/or Access
- Junior or Senior call center experience
- Strong computer skills; able to navigate a variety of EMR systems
- Able to work independently, strong communication skills, detail oriented
- Able to manage multiple projects and meet strict deadline requirements
Skills
- Entering Data
- Data Entry
- Data Collection
- Basic Computer Skills
- HEDIS
- EMR
- Microsoft Office
- Excel
- Microsoft Excel
- Medical Records
- Medical Review
- Call Center
- Call Center Support
Qualifications
- Years of experience: 1 year
- Experience level: Entry Level

location: USA only
🚀 About Pulley
Pulley's mission is to make it easier for anyone to start a company. We believe that more startups should exist and that founder-led companies are more successful in the long term. With Pulley’s cap table management tools, companies can better understand and optimize their equity for the long term. Starting a company is hard enough. Managing equity shouldn’t be.
We’re a high-performing team looking for passionate, execution-focused, self-starters to help us build equity management tools for founders. Pulley is growing quickly with over 3000 customers including unicorns like Clubhouse, Birdies, and Coda - all within our first year of product launch. Our trajectory is fueled by top investors like Founder's Fund, Stripe, General Catalyst, Caffeinated Capital, 8vc, Elad Gil, among other great angels. All of our growth has been organic, and we’re growing the team to meet the demand.
🌟 Who We're Looking For
We’re looking for a Data Analyst who’s excited about using data to drive actionable business and product decisions. As a Data Analyst, you would work with a broad set of partners such as product managers, engineers, and business stakeholders. You will work on product analytics, metric reporting, data quality, and dashboard development, to surface insights and recommendations and create a deeper understanding of customer behavior.
🛠 Responsibilities
- Drive business intelligence through dashboards by owning strategy, design and build of dashboards. You will also collaborate with stakeholders to identify improvements to data visualizations for delivering insights in an easily digestible format.
- Work with team members to collect, analyze, and visualize data. Then develop presentations and reports based on recommendations and findings to drive product growth and build a better experience for startups on the Pulley platform.
- Develop and maintain high-quality and unified data by improving linkage across internal business systems, defining source-of-truth business logic, and fixing errors in data logging.
- Collaborate with cross-functional partners to identify patterns of behavior and drive product decisions with clear data narratives. Then work with teams to establish KPIs to measure the effectiveness of new features.
- Prepare reporting for tracking company and department-level KPIs that effectively communicates trends, patterns, and predictions. You will also synthesize signals from product usage to customer interactions to understand root causes of metric movements.
🙌 Qualifications & Fit
- You are a fast learner - You can readily learn most technologies as you go
- Strong verbal and written communication skills; you are interested in the practical applications of converting data into insights and narratives
- You have a strong work ethic, a willingness to do whatever it takes to deliver results under deadlines, and excellent attention to detail
- Experience with: data manipulation, cleaning, analysis and visualization, SQL
- Preferred: Experience with Python/R/Stata or other statistical languages
💚 Benefits
- Generous health insurance
- Unlimited vacations
- Commuting & food
✨ Our Culture
TL;DR - Pulley is not a good fit for everyone, and that's OK. This is a bit about the culture of Pulley. We need people who aren't just interested in working at a startup; we need people who are excited about building a great company. In addition to function-specific skills, here are some traits that enable our team members to thrive:
- Ego-less Learner -- Every day at Pulley, we are trying things that we've never done before - sometimes they work; sometimes they don't. To be successful here, you need to embrace that feeling of 'being a beginner' and be willing to accept feedback in stride.
- Less Talk; More Action -- Everyone rolls up their sleeves at Pulley. If your strength is being the 'idea gal or guy' but you don't create your own slides or write your own code, then we are not a great fit.
- Prioritize ruthlessly -- Our view is that being a 10x engineer is not about writing more code; it's about making the right decision on what to build. We're hiring for people who can make the right strategic decisions on where to spend their time.
We’re also inspired by the culture work done at other companies and plan to develop our own playbook here too. One of the concepts that resonate most with us is the importance of stunning colleagues.
If you like the sound of our environment and you’re passionate about joining a team like the one we’ve described, we'd love to talk!


location: remoteus
Data Entry Specialist
- Job Category: Claims
- Requisition Number: DATAE006816
- Full-Time
- Richmond, VA 23173, USA
- Remote, USA
Job Details
Description
The Data Entry role is a very dynamic position where the associate will be supporting a critical part of Intake Operations for CorVel. The Data Entry agent is responsible for entering claim requests received via email, fax, or written correspondence into CareMC. The role requires the individual to manage the work assigned to them on a daily basis. This role is remote and does require the associate to hit daily production numbers to ensure timelines are met for our clients.
This is a remote work opportunity. The schedule for this role is 9 am-6 pm, M-F, with rotating weekends.
ESSENTIAL FUNCTIONS AND RESPONSIBILITIES:
- Enter any assigned email or faxed requests from customers into the CareMC application.
- Use all available resources to correctly follow the client's filing requirements and/or fill in missing information.
- Ability to multitask between applications utilizing 2 monitors
- Consult with Supervisor or other departments to clarify answers to inquiries.
- Contact AM if applicable to complete filing of claim
- Maintain reports and spreadsheets as needed.
- Requires punctual and consistent attendance.
- Ability to hit daily performance goals
- Exceptional quality on daily work
- Additional duties as assigned.
KNOWLEDGE & SKILLS:
- Proficiency in entering Workers' Comp claims in CareMC
- Proficient in Microsoft Office 2016 tools (Word, Excel, Teams)
- Solid communication skills, written and verbal
- Attention to Detail and ability to prioritize work
- Decision Making and Time management skills
- Customer/Client Focus and Teamwork
- Problem Solving/Analysis
- Schedule flexibility
EDUCATION/EXPERIENCE:
- Filing Workers' Comp claims in CareMC
- High school diploma or equivalent
About CorVel
CorVel, a certified Great Place to Work® Company, is a national provider of industry-leading risk management solutions for the workers' compensation, auto, health and disability management industries. CorVel was founded in 1987 and has been publicly traded on the NASDAQ stock exchange since 1991. Our continual investment in human capital and technology enables us to deliver the most innovative and integrated solutions to our clients. We are a stable and growing company with a strong, supportive culture and plenty of career advancement opportunities. Over 4,000 people working across the United States embrace our core values of Accountability, Commitment, Excellence, Integrity and Teamwork (ACE-IT!).
A comprehensive benefits package is available for full-time regular employees and includes Medical (HDHP) w/Pharmacy, Dental, Vision, Long Term Disability, Health Savings Account, Flexible Spending Account Options, Life Insurance, Accident Insurance, Critical Illness Insurance, Pre-paid Legal Insurance, Parking and Transit FSA accounts, 401K, ROTH 401K, and paid time off.
CorVel is an Equal Opportunity Employer, drug free workplace, and complies with ADA regulations as applicable.

location: Spain
Who we are
We're a global marketing service provider and we specialize in affiliate marketing & publishing. We are digital natives, data obsessed and focused on measurable outcomes. Some of the most talented individuals you'll ever meet, all with one thing in common: doing great work, and growing as a team.
Our values lie in Diversity, Equity, Inclusion and Belonging. We strongly believe in equality and stand against all kinds of discrimination. We dare to be unapologetically ourselves. Come join a team of explorers who are motivated by growth, and driven by results.
What you'll do
The successful candidate will be responsible for the integration of Machine Learning models in a production environment in agreement with the global R&D roadmap. Following good software development practices, they will facilitate operations between data scientists and software engineers. In collaboration with the rest of the R&D team and the development team, they will help to develop state-of-the-art products that leverage big data towards innovative SEO, conversion optimisation, and website creation. This position reports to the R&D Engineering Team Lead.
- Work closely with the rest of the R&D team to gather data that can be used for model development;
- Work closely with the rest of the data scientists to convert experimental code into production code following development team practices;
- Follow an agile and flexible team organisation in order to deliver optimal performance for each project;
- You are a Data Engineer who can:
  - Work closely with the stakeholders and the data scientists to define product requirements;
  - Collaborate with internal team members (SEO, Conversion, Data Scientists) on best practices, prioritization and integration of end-to-end ML products;
  - Deliver exceptional work and innovative models, on strategy and on time;
  - Remain up to date with industry standards and technological advancements that will improve the quality of your outputs;
  - Monitor and anticipate data-related issues, their management and governance;
What you'll bring
- Bachelor's degree or equivalent in computer science, or proven equivalent experience in a technical organization or related field;
- 1+ year of experience in relevant positions;
- Proficiency in Software Engineering skills along with a deep understanding of Python, code review best practices and industry standard methodologies;
- Database management and operations (relational, schema-less, columnar);
- Proficient with ETL tools such as Airflow and Analytics Engines for large-scale data processing such as Spark and its ecosystem;
- Attention to detail and the ability to multi-task in a challenging, fast-paced environment;
- Strategic thinker, able to anticipate, think critically, evaluate, and act;
- A communicator and relationship builder, able to navigate effectively across organizations and cultures.
- Experience with Unix-based systems, including bash programming
Strong consideration will be provided to candidates that also offer the following:
- Previous experience on taking ML/DL models into production, including lifecycle management, version control (DVC);
- Experience with SQL, preferably with knowledge of Athena
- Experience with Agile Project Management tools such as Jira;
- Experience in cloud platform (GCP, AWS or Azure), AWS will be a plus (Athena, S3, EMR Serverless);
- Prior experience with SCRUM/Agile methodologies;
What's in it for you
- Relocation & soft landing for you and your family (applicable in Spain, if you are moving from a different city/country)
- Attractive salary
- Competitive private health & life insurance package
- Flexible working hours and remote-friendly tools and methodologies to stay connected
- Workplace perks such as coffee, fresh fruit, bread, sweets and drinks provided daily
- In addition to regular yearly leave, six weeks' paid leave for every four years of service
- Be part of a multicultural environment
- Wellbeing programme
- Different employee events throughout the year and team building activities
- Career Development training and programs to help you grow!

Senior Healthcare Data Analyst - Irvine, CA / Glendale, CA / Columbus, OH - Remote Opportunity
Do you like innovation and technology? We do, too. Reveleer is a healthcare software company serving health plans and care management organizations. Our teams are dedicated to disrupting and modernizing healthcare by building powerful technology that provides an amazing user experience, scalability, and automation.
Reveleer is the first of its kind in our space – a self-service platform that mines health data using AI, NLP, and advanced analytics. Our Platform enables health plans to focus on the quality of care people receive, leaving the most complex and repetitive tasks for technology to solve. Be part of what Reveleer is building next!
As the Senior Healthcare Data Analyst, your primary responsibility will be to partner with internal teams and customers on onboarding programs to the Reveleer platform. Programs include, but are not limited to, Risk Adjustment Initiatives, HEDIS Quality Programs and CMS Government Audits, CMS Enrollment and Encounter submissions, etc. The Senior Healthcare Data Analyst will help to ensure a smooth exchange of data with our healthcare clients.
We're a hardworking, energetic team that is passionate about our customers and believes that to be successful we should never stop learning: learning about our customers, our product, and how to build better software.
What You'll Do:
- Work with internal and external customers to map, transform, validate and load customer data for all products.
- Troubleshoot and resolve data issues for prompt and accurate data loads.
- Research data questions, create ad-hoc reports, etc.
- Work with internal and external customers to map, create, QA and implement data extracts for all products.
About You:
- Experience with US Healthcare / Managed Care Organizations, specifically around Compliance and RISK Assessments programs
- Working knowledge of healthcare industry and health data (provider demographics, patient data, Quality/Risk program information, compliance standards) is strongly preferred with background specific to quality measures (HEDIS), risk adjustment, and medical record review or audit
- Comfortable working with data tools: Microsoft SQL, Excel
Preferred:
- Experience working on SaaS web applications
What You'll Receive:
- Competitive Salary
- Medical, Dental and Vision benefits
- 401k
- PTO Plan
- Fully stocked breakroom with drinks and snacks
- Dog friendly office
- Social gatherings
Reveleer is an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity, sexual orientation, age, marital status, veteran status, disability status or genetic information, in compliance with applicable federal, state and local law.


location: Europe
About Us

At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.
About the Role
Data is at the core of everything we do at Pagos, and our Data Analysts play a key part in ensuring the quality and consistency of our data.
With a keen eye for detail and a collaborative approach, you'll partner closely with our Engineering and Product teams. In this role, you'll focus on testing data pipelines, debugging rogue SQL queries, and tracking down data inconsistencies.
As a Data Analyst, you will:
- Own data quality across the entire platform
- Design scalable and reliable test plans to ensure consistent data delivery
- Find, research, and track bugs
- Plan and coordinate testing activities alongside cross-functional teams
- Help translate customer and product requirements to create and validate data models
- Proactively seek and execute on solutions for day-to-day project challenges
What We’re Looking For
We’re looking for someone with:
- 2-4 years of experience in Data Analysis
- Excellent SQL knowledge, with the ability to write and run SQL queries effectively and independently to gather test data needed for setup, manipulation, and analysis
- Knowledge of fundamental statistical concepts for data validation
- The ability to read Python code and create tests of code logic using testing frameworks, preferably with some programming capability for creating and modifying automated tests
- Experience with JSON files and interpreting the data they contain
- Experience working with complex, large-scale relational databases and other tools to compile and interpret quantitative data
- The ability to work in a high-pressure, deadline-driven environment
- Advanced experience with Microsoft Excel or Google Sheets
- Exceptional communication skills—both written and verbal—while working with an asynchronous global team
Nice to haves:
- Experience programming in Python, Java, C#, or a similar language
- Experience with Redshift
- Experience in the fintech/finance industry

Ballotpedia is seeking to hire a full-time, 100% remote Election Data Researcher.
Reporting to Ballotpedia’s Director of Research, the Election Data Researcher will be responsible for conducting research and monitoring sources, primarily on the Web, to generate and maintain a comprehensive election date calendar, including elections from the federal and state level all the way down to the smallest cities and school districts. The position will also be responsible for executing complex research protocols about a variety of other political topics.
As an Election Data Researcher, you will grow to develop a good understanding of local election administration processes and know how to appropriately research, categorize, and summarize varying systems to fit the structure required to generate Ballotpedia’s election calendar.
This is a great position for a skilled researcher who is passionate about the opportunity to help expand fact-based, neutral election coverage in the United States for American voters, especially at the local level.
Responsibilities
As an Election Data Researcher, you will:
- Research reliable elections information sources and logic for Ballotpedia’s automated election date crawler technology.
- Monitor and process the results of the automated crawler to generate and maintain Ballotpedia’s election calendar.
- Participate in other Research department activities, including:
- Conducting election laws research and research on other pertinent political topics
- Executing detailed protocols for miscellaneous research projects
- Helping to develop, write, and audit detailed research protocols for miscellaneous projects
Qualifications and Characteristics
An ideal Election Data Researcher will:
- Have exceptional attention to detail and an ability to understand complex processes and consistently follow complex instructions without errors over a large number of iterations.
- Have excellent research skills. They can follow general guidelines to locate appropriate online and direct outreach sources and record information in an organized, concise manner to answer detailed questions.
- Have sharp critical thinking skills and be able to understand the varied structures of a large number of sources and apply more general categories and logical statements to those structures.
- Enjoy organizing their own lives and the world around them through elegant structures. They maintain orderly habits and perceive external disorder as an opportunity to innovate and improve.
- Enjoy receiving and implementing critical feedback for constant improvement.
- Be able to accurately and concisely summarize complex concepts, difficult text, and detailed structures.
- Have a passion for elections, election and voter information, and politics, with a commitment to Ballotpedia’s mission to remain neutral.
- Be adept at using utilities such as Google Chrome, Docs and Sheets, or similar tools to navigate, find, and record information from the Internet.
Environment
The Election Data Researcher will work remotely from their home location. All Ballotpedia staff work remotely. To join Ballotpedia, you must have a computer with Internet access.
Ballotpedia has a flexible work environment, BP Flex, in which every employee enjoys unlimited vacation and flexibility in scheduling. Each employee will be oriented to the principles of Ballotpedia’s flexible environment during new employee training.
Compensation
The starting pay range for the Election Data Researcher is $40,000-$50,000/year commensurate with experience.
In addition to salary, Ballotpedia offers an annual benefits stipend equivalent to $8,000 that is paid out in equal increments in each paycheck once an employee becomes benefits eligible. The stipend may be used to pay for a full benefits package, including health, vision, and dental insurance; retirement accounts; and more. If benefits are not elected, the stipend is taxed as regular income and added to salary.
To Apply
To apply, visit the Ballotpedia job opportunities page and fill out the form.
Please attach the following in PDF format:
- résumé
- cover letter detailing your interest in Ballotpedia’s mission, this position and your salary expectations
Please ensure that either your résumé or your cover letter include your current address.
About Ballotpedia
Ballotpedia is a collaborative team of fast learners and creative problem solvers who are eager to work hard to make the world a better place. We believe the world will be a better place if every citizen has access to information to make informed decisions about their vote in every election in which they are eligible to vote: primary, general, and special elections; federal, state, and local offices.
We work diligently to present the available information about elections, candidates, judges, ballot measures, policies, and more in a way that enables our readers to vote with confidence and to act as engaged citizens outside of the polling booth.
Ballotpedia readers, like Ballotpedia staff, are special people.
When we launched in 2007, we did not go out of our way to seek new readers. Starting with our small team of visionary idealists, nerds, and aspiring political journalists, we just wrote the best unbiased online articles we could, especially about ballot measures. Readers found those articles in droves. It turns out there was an unclaimed audience out there—people who wanted straightforward facts about political issues, and were willing to read at length instead of just scanning the headlines.
“If you build it, they will come”: our readers came to our neutral oasis in growing numbers; we’ve had many millions of lifetime pageviews, we reached nearly half of all voters in 2020, and, in the month surrounding the November 2020 election, we were the 77th most-visited website in the U.S.
We’ve come to realize that we need to meet our readers where they are. In doing so, over the past five years, we’ve grown our email newsletter program from infancy to include more than 1,000,000 opt-in subscribers with more than a dozen newsletters to choose from. We are working in numerous ways to help put our neutral information in front of people at the times when they most need it, including on mobile phones while you’re standing in the voting booth. We firmly believe that our readers, and the mindset we help them cultivate, are essential in a world where too many others are fighting to get us all addicted to sensational posts and the irrational decisions they foster.
If this is a mission you’d be willing to work hard to achieve, and if this is a team you’d be willing to work hard with—JOIN US.

We are honu.ai, a venture-backed startup building a unique platform that will give small business owners worldwide superhuman decision-making capabilities. We are currently in stealth mode and have raised a seven-figure pre-seed round from a fantastic roster of investors, who are early backers of TransferWise, Tide, Coinbase, CityMapper, Cazoo, Nested, Habito, Cleo, King, and Snyk. We have also just secured a Smart Grant from the InnovationUK fund.
The technology we are building is novel and we will be pushing the boundaries of what has been done in this space. The founder is ex-Goldman Sachs, and ex-Facebook, a second-time entrepreneur with a background in robotics and neuroscience. This is a unique opportunity to join a highly ambitious, venture-backed company, on the ground floor. We are London based and remote-first.
Today we are a team of ten; we are seeking exceptional technologists who have experience designing and building complex products and deploying them at scale. We're taking a revolutionary approach and people who bring a diversity of experience and a track record of creative solutions are particularly welcome.
We are seeking a full-stack Data Scientist with strong foundations in numerical methods, ML, and statistical techniques - who has experience being part of the initial data team in a startup. You will be one of the first data science hires in the company and be part of the core team of 10 building the first iteration of a unique product.
You Will
- Work closely with the Founder and the wider tech team to build the various intelligent components of the platform
- Solve a diverse range of problems that cover a wide spectrum of complexity
- Contribute to building the data infrastructure, working alongside a team of engineers
- Approach tasks with pragmatism, using sound judgement and an 80/20 approach in a rapidly evolving environment
- Be embedded in multiple areas of the business; as a member of the core team you will also be expected to become well-versed in the business context, understand our clients, and the problem space, and then translate them into impactful solutions
Requirements
About You
- You’re an impact driven, product-centric data scientist
- Ideally you possess experience across diverse sectors; eCommerce, MarTech, Decision Intelligence or FinTech would be a big plus; alternatively, it would also be beneficial to have been part of scaling a B2C product
- You are experienced in building models (i.e. forecasting to predict customer behaviour) as well as, ideally, deploying them into production
- You’re a strong systems thinker able to see the bigger picture
- You possess extensive technical proficiency in Python and are comfortable with a range of libraries including Pandas, scikit, NumPy etc. as well as PyTorch/ TensorFlow and, ideally, cloud-based architecture (AWS/GCP/Azure etc.)
- You understand the applied side of data science with 3+ years of industry work experience and at least 1 of those in a startup
- You’ve worked closely with data engineering and have been involved in building ETL pipelines from scratch
- You should be able to operate autonomously, to a large degree, whilst knowing you have the support of engineers when needed
- You have experience connecting systems to external APIs in order to ingest third-party data
- You value honest feedback and creating an environment where it flourishes
- You're comfortably organised amongst the chaos
Tech Stack
Our initial stack is Python centric with some ancillary GoLang services for the backend, React / Typescript on the front end, and we are running on AWS. We have a pragmatic, non-dogmatic approach and will refine the tech stack, finding the right tools for the job, as the product evolves. Scientific computing will be mostly implemented in Python, for now.
Logistics
- Introductory phone call with our Recruiter (30 mins)
- Technical screening call with the Founder (30 mins)
- Technical assessment
- Meet the team (30 minutes x 3 interviews)
- Chat with Founder (30 mins)
Benefits
We are a young business with a huge amount of evolution to come. Alongside a salary of between £60K - £90K, we are able to offer:
- Competitive stock options package (between 0 - 0.5%)
- Remote first working policy
- Regular company offsites
We are growing fast and will be sure to respect your time and expectations each step of the way. We embrace diversity in all forms and are dedicated to building an inclusive, equitable environment for all people to do their best work with us.
We’re an equal-opportunity employer. All applicants will be considered for employment without attention to ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity status or disability status. If there are any suitable adjustments that can be made to ensure a smooth interview process, please do let us know.


location: remoteus
Data Entry Specialist
- Job Category: Customer Service
- Requisition Number: DATAE003442
- Full-Time
- Virtual
- United States
Please note the shift hours for this position are 7:30 am – 4:30 pm Pacific Time, Monday – Friday.
We are looking for a highly-detailed and tech-savvy Data Entry Specialist to join our Operations team! The ideal candidate will be responsible for quickly and accurately transferring merchant information from incoming digital applications to our various platforms and systems. Attention to detail is essential, and timely account entry and boarding ensures that Signature Payments is able to offer the highest level of service. The Data Entry Specialist is responsible for quality control, data integrity, and account maintenance functions as well.
What you’ll do:
- Gather and input data into databases; Key in new merchant applications to Signature Payments’ internal and partner systems
- Execute and key account changes
- Review data for errors, make corrections, and quality control newly boarded accounts
- Review and address data integrity issues
- Complete High Risk Registration Process (Fees and Renewals)
- Set up new hierarchies including associations, security groups, and users
- Complete application receipt confirmations in a timely manner
What we need from you:
- High school diploma or equivalent
- Strong data processing skills
- Minimum typing proficiency of 60 WPM
- Excellent written and oral communication skills
- A keen eye for detail
- Highly organized and able to multitask, prioritize, and adhere to deadlines
- Familiarity with multiple software applications including Microsoft Office (Word, Excel, Outlook) and Adobe Acrobat (PDF)
- Flexible and proactive in taking on assignments and responsibilities; strong desire to learn
- Ability to use or adapt to new programs, software, equipment, and/or hardware
How to be an all star:
- Experience with payment processing platforms including TSYS, First Data, etc.
- Familiarity with interchange and card brand pricing elements
Who we are:
Signature Payments is a payments technology company that provides solutions that allow merchants and partners to integrate and accept payments anytime, anywhere. With more than 20 years as a trusted payments provider, Signature offers a variety of products and programs for merchants of all sizes across industries.
North American Bancard helps businesses prosper through smarter commerce for all and we’re committed to building the next generation of payment technologies and experiences. More than 250,000 satisfied merchants trust us to process more than $34 billion worth of transactions every single year. Propelled by cutting-edge technology, user-friendly products, and features, and the best customer service in the business, we continue to innovate to satisfy our clients and their customers.
At North American Bancard, we celebrate diversity and create an inclusive environment for everyone. We are an equal opportunity employer.
What We’re Looking For:
Cover Whale is searching for a Data Engineer to join our rapidly growing Data team. This position will provide an excellent opportunity to impact multiple business areas. At Cover Whale, we rely on data to drive our systems and solutions. We are seeking an experienced data engineer to continue to help us revolutionize the InsureTech space.
What You’ll Do:
- Primary focus will be on building out our ETL processes.
- Communicate highly complex data structures to organizational leaders.
- Provide technical expertise, thought leadership, and architectural guidance to our Data and Engineering teams.
- Collaborate with Product Management, internal stakeholders, and external vendors to continually improve deployment and operations.
- Foster a continuous learning environment through continuously improving processes, procedures, and approaches.
- Work with cross-functional Data teams to develop, manage, and implement products and technology roadmaps in alignment with long-term strategies.
- Create proof-of-concept and pilot technology demonstrations to stakeholders surrounding microservice architecture and service meshes.
Requirements
- Three or more years of experience with Python and SQL
- Located in the US
- Linux expertise
- Familiarity with the AWS ecosystem, specifically API Gateway, Kinesis, Athena, RDS, and Aurora
- Experience building ETL pipelines for analytics and internal operations
- Experience building internal APIs and integrating with external APIs
- Effective communication skills, especially for explaining technical concepts to nontechnical business leaders.
- Desire to work on a dynamic, research-oriented team.
- Experience with distributed application concepts and DevOps tooling
- Excellent writing and communication skills
- Troubleshooting and debugging ability
Nice to have:
- 2 or more years of experience with Hadoop, Spark and Airflow
- Experience with DAGs.
Benefits
- Competitive Pay
- Medical/Dental/Vision
- Short/Long Term Disability
- Employer Paid Life Insurance and AD&D
- Matching 401k Plan
- Flexible Vacation Policy
- Generous Parental Leave Program
- Employee Assistance Program
We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.


Mediavine is seeking a Data Engineer to join our engineering team. We are looking for someone who enjoys solving interesting problems and wants to work with a small team of talented engineers on a product used by thousands of publishers. Applicants must be based in the United States.
About Mediavine
Mediavine is a fast-growing advertising management company representing nearly 10,000 websites in the food, lifestyle, DIY, and entertainment space. Founded by content creators, for content creators, Mediavine is a Top 20 Comscore property, exclusively reaching over 125 million monthly unique visitors. With best-in-class technology and a commitment to traffic quality and brand safety, we ensure optimal performance for our creators.
Mission & Culture
We help content creators build sustainable businesses. From educational tools and cutting-edge plugins to ad technology that maximizes earnings without slowing down your site, our motivation is to ensure their brand and business grow in every respect.
We are striving to build an inclusive and diverse team of highly talented individuals that reflects the industries we serve and the world we live in. We are committed to creating a culture where everyone feels welcome. We are looking for individuals that will challenge us to continuously evolve and make Mediavine the employer of choice for people of all backgrounds. We strongly encourage minorities and individuals from underrepresented groups in technology to apply for this position.
Diversity and inclusion aren't platitudes to us; we take them seriously. Have a look at our team and read through our blog posts to learn more about our values and discover if Mediavine is the place for you!
Position Title & Overview
The Data & Analytics team consists of data analysts, data engineers and analytics engineers working to build the most effective platform and tools to help uncover opportunities and make decisions with data here at Mediavine. We partner with Product, Support, Ad Operations and other teams within the Engineering department to understand behavior, develop accurate predictors and build solutions that provide the best internal and external experience possible.
A Data Engineer at Mediavine will help build and maintain our data infrastructure. Building scalable data pipelines, managing transformation processes, and ensuring data quality and security at all steps along the way. This will include writing and maintaining code in Python and SQL, developing on AWS, and using third-party tools like Rundeck, Metabase, and others to round out the environment.
Our current data engineering toolkit consists of custom Python data pipelines, AWS infrastructure including Kinesis pipelines, Rundeck scheduling, dbt for transformation, and Snowflake as our data warehouse platform.
Essential Responsibilities
- Maintain and support data pipelines that make data available for analytic and application use cases
- Follow established best practices in creating new pipelines
- React to data quality notifications, working with users and other developers to communicate the issue and resolve
- Grow in technical knowledge and expand your ability to contribute to the team
- Support data analysts' and analytics engineers' ability to meet the needs of the organization
- Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices
- Build or implement tooling around data quality as needed
- Provide support when data issues are discovered by users or other team members
- Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users
Requirements
Location:
- Applicants must be based in the United States
You Have:
- 1+ years of experience in a data role
- Experience with Python (Understands libraries, classes, functions)
- Proficient SQL skills (CTEs, window functions)
- An understanding of data modeling concepts
- Experience checking in code to production with source control, pull reviews, approvals, etc.
- Experience working with DevOps to deploy, scale and monitor data infrastructure
- Scheduler experience either traditional or DAG based
- Comfortable working with cloud data warehouses (BigQuery, Snowflake, Redshift)
Nice to haves:
- Experience with web analysis such as creating data structures that support product funnels, user behavior, and decision path analysis
- Experience with dbt
- Experience with orchestration tools particularly across different technologies and stacks
- The ability to make your teammates laugh (it wouldn’t hurt if you were fun to work with is what I’m saying)
- Familiarity with event tracking systems (NewRelic, Snowplow, etc)
- Experience with one or more major BI tools (Domo, Looker, PowerBI, etc.)
Benefits
- Remote work environment
- Travel opportunities (remember those!?)
- Comprehensive benefits including 401k, Health, Dental, and Vision insurance
- Learning allowance
- Generous Vacation/Time off policies
- Additional side benefits such as home-office upgrades, tuition reimbursement, paid gym memberships and wellness retreats, upgraded flights, cool swag and more
- Company match charitable donations
Mediavine is an Equal Opportunity Employer

About Us
Outsourced Doers is the world's fastest-growing virtual assistant company for online entrepreneurs. Outsourced Doers matches trained marketing Virtual Assistants (who we call Doers) with busy, time-poor online entrepreneurs (who we call Founders). With over 1,000 Doers and Founders from over 50 countries, we're helping Founders globally to work less and achieve more.
About the Role
We are currently searching for a Data Insights Analyst to join our marketing team. The primary purpose of the Data Insights Analyst is to leverage business performance by providing up-to-date analysis on consumers, brands, and categories to appropriate internal and external stakeholders.
Role responsibilities range in nature and complexity, with key components including:
- Undertaking research projects involving detailed analysis and assessment of growth opportunities with the ability to distill key takeaways for the team.
- Work across omnichannel platforms to provide data analysis and insights into the business' advertising solutions.
- Complete ad hoc data requests from various stakeholders exposed to marketing, finance, customer support, and product areas.
- Take ownership of reporting key business and marketing metrics.
- Collaborate with growth team members to test and assess the efficacy of emerging growth channels.
About You
In addition to the ability to thrive in a fast-paced environment, balance multiple priorities, work under pressure and meet strict deadlines; we are seeking applications from candidates with the following skills and characteristics:
- A natural sense of curiosity and an urge to understand “why.”
- Highly analytical with experience combining and analysing large datasets from multiple data sources.
- A drive and ability to build something great.
- A team player with excellent communication skills.
- Ability to pick up new tools quickly.
What You've Done:
The following experience is required (or, in some cases, desired) for this role:
- A degree in Marketing /Communications or similar
- 3+ years experience in a similar position
- Understanding of research software, Microsoft Office programs, and market research skills.
- Excellent project management skills with the ability to organise and prioritise.
What Outsourced Doers offers:
- A chance to work within a fast-growth tech company.
- A positive work environment, where you are encouraged and supported to flourish and achieve success.
- Full-time home-based position with full government benefits including 13th Month Pay
- HMO upon regularization
- Bonus schemes, reward & recognition programs, and monthly team celebrations
- Great company culture & Work-life balance
- Monthly team celebrations
- Free and ongoing training in Digital Marketing
- You'll get to work with an intelligent, hard-working, and vibrant team
- Lots of support and motivation to be the best you can be
If you are interested in applying for this position, just hit the “Apply” button below.

eSimplicity is a modern digital services company that delivers innovative federal and commercial IT solutions designed to improve the health and lives of millions of Americans while defending our national interests. Our solutions and services improve healthcare for millions of Americans, protect our borders, and defend our country on the battlefields, supporting the Air Force, Space Force, and Navy.
eSimplicity's people-centric approach aims to transform the American healthcare experience through innovative technologies. Our team's experience spans various federal civilian customers on diverse projects across its core competencies. Our priority is to safeguard our community by leading the government's cloud migration, developing artificial intelligence models to identify fraudulent Medicare claims, and accelerating access to data and insights.
The Role
Data Pipeline engineers ensure that our customers work with updated and correct data in Foundry. You are the first to respond to data health failures on key pipelines. You are technical, with a passion for system support. You’re comfortable reading code to identify fixes and making changes to improve the performance and stability of production data pipelines. Working alongside pipeline engineers from other customer accounts, your insights into the product and process will form an essential part of feedback to our product engineering team, and other user and product support teams.
Core Responsibilities:

- Maintain build schedules so that pipelines run
- Setting up and maintaining health checks on different pipelines
- Respond to, triage, and debug pipelines when there’s a problem (usually this is when health checks fail). This will include limited out of hours support for critical issues (e.g. a critical subset of alerts can page you overnight or during the weekend*)
- Reading code and writing code changes and/or modifying the monitoring set-up where necessary
- Knowing and understanding how to navigate the pipelines and documentation.
- Following SOPs to contact other teams and data providers when data is incorrect or not received on time
- Communicating outages with the end users of a pipeline
- Contributing monitoring tooling improvements (where feasible).
- Comfortable reading and writing code in SQL, Python, Pyspark and Java.
- Basic understanding of Spark and familiar/interested in learning the basics of tuning Spark jobs.
- Practical experience with performing root cause analysis and documenting lessons learned from production incidents (e.g. creating post-mortem reports)
- Ability to work within an agile team
- Strong written and verbal communication skills with the ability to skillfully engage with customers on complex, sensitive topics
- Strong organizational skills and attention to detail through effective prioritization
- Interim TS or above.

A few words about the project
We're looking for a Senior Data Scientist to join our team in Warsaw or remotely. You will join our client, one of the world's biggest private investment companies, located in the USA. The organization specializes in consulting services for real estate, venture capital, investments, and credit.
As a Senior Data Scientist, you'll support data analysis workflows and data engineering efforts at the US-based private investment firm. You'll work directly with other data scientists and data analysts to enable data-driven business decision processes. You'll also cooperate with the data engineering team to design and implement common code for the data science team.
You'll deliver solutions using:
- Technologies: Python, SQL, Pandas, NumPy, nltk, scikit-learn, seaborn, etc.
- Tools: Jupyter Notebook, Snowflake, dbt, Docker, Azure DevOps, Octopus Deploy & Jenkins
- Cloud: AWS (EC2, S3, Athena), Box
- Best Practices: Continuous Integration, Code Reviews, Scrum/Kanban, Python packaging
The ideal candidate will be well organized, eager to constantly improve and learn, driven and, most of all, a team player!
The recruitment process is well organized: it's one video meeting that lasts about 1.5-2 hours and consists of 30 minutes with a Manager (algorithmic and data-science-related technical questions), 30 minutes with a Data Analyst (cultural-fit interview), and 30 minutes with a Senior Software Developer (programming questions). Then you will receive the decision.
There is a plan to work with at least 4 hours a day of overlap with the US East Coast timezone, so your work should be organized in hours like 10:00-18:00 or 11:00-19:00 PL time.
You will be responsible for:
- Extracting actionable insights from broad, open-ended questions to influence investment decisions
- Developing common data analysis workflows using NLP (sentiment analysis), Geospatial analysis and customer segmentation methods
- Designing a common library for other data scientists that includes storage connectors, proxy authentication, secrets management, standard data visualization methods, and more
- Designing and implementing ML/DL models for text and customer segmentation data
- Designing and building AWS infrastructure together with the Data Engineering team
- Developing data technology stack including API services and ETL pipelines,
- Designing datasets and schemes for consistency and easy access
Requirements
- At least 5 years of professional experience in data-related roles or software engineering
- Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar
- Excellent command of spoken and written English, at least C1
- Expertise in Python and SQL
- Proficiency in statistics and machine learning, as well as Python libraries like Pandas, NumPy, matplotlib, seaborn, scikit-learn, Keras, etc
- Excellent understanding of natural language processing (NLP)
- Good understanding of geospatial analysis and data formats
- Experience with AWS EC2, S3 and Athena
- Ability to use Docker and create Dockerfiles
- Ability to take an ambiguous analysis question and run with it independently
- Creative problem-solving skills
- Great customer service and troubleshooting skills
Nice to have:
- Experience in designing and implementing REST API services
- Experience in building ETL processes and data pipelines with platforms like Airflow or AWS services
- Familiarity with Azure DevOps stack
- Knowledge of Google Maps, Open Street Map or TomTom APIs
- Experience with ArcGis ecosystem
- Experience in operating within a secure networking environment, like a corporate proxy
- Ideally you have a one-month notice period or are available ASAP, but we can also wait for you longer.
Benefits
What do we offer?
- Flexible working hours and remote work possibility
- Multisport card
- Private medical care
- In-house workshops and tech talks
- Free access to the best tools and software to develop your skills and work effectively
- Comfortable office in central Warsaw equipped with all the necessary tools for comfortable work (Macbook Pro, external screen, ergonomic chairs) - if working on site
Sounds like a perfect place for you? Don’t hesitate to click apply and submit your application today!
Check our place: fb /instagram
Description
This is a remote position.
Overview
Anant is a destination employer for high-performing, diverse, global talent. Our Data Engineers support the development, operation, and maintenance of real-time data processing. They oversee and deliver the success of client and internal projects. The Data Engineer will not only support our internal team, but will also participate in client project work including design of novel systems, debugging performance degradations and read/write latencies, audits, monitoring, and health checks. An ideal Data Engineering candidate will have experience supporting rollout of migration tooling through client environments by troubleshooting GKE, Airflow, Dataproc, DataStax Enterprise, and DataStax Astra. Other candidates will gain experience using these tools.
We are looking for a Data Engineer to join our team immediately. We look for the best and brightest and those willing to learn.
Responsibilities
- Work with multiple teams and multiple projects (e.g., application, infrastructure, cloud, etc.) to:
  - Complete requests (adding new or decommissioning existing clusters)
  - Debug and resolve issues
- Utilize project management software (e.g., Jira) to log time and resolve tickets
- Create and update SOPs, Runbooks, issue reports, and other documentation as required
- Consult on client projects, maintain client confidentiality and protect client operations by keeping information confidential
- Contribute to team effort by using effective communication skills, being a self-starter, and taking responsibility for deliverables
Required Qualifications
- Certifications in Spark
- 5+ years of relevant software design and development or certifications in Cassandra / DSE
- 5+ years of relevant software design and development or certifications in Google Cloud Platform (GCP)
- At least ONE of the following:
  - 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE)
  - 5+ years of relevant software design and development in Cassandra (Astra)
  - 2+ years of relevant software design and development including Airflow
Additional Qualifications
- Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure
- 2+ years of relevant software design and development including the below as well as source control apps such as Bitbucket, Github, etc.
- 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm
- 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform
- BS degree in Computer Science or related technical field involving coding, or equivalent practical experience
- Ability to troubleshoot, debug and optimize code
- Ability to identify and automate routine tasks
- Incident management and root cause analysis
- Strong organizational, time management, and detail skills
- Strong communication and interpersonal skills, able to comfortably and pleasantly deal with a variety of people
Hard Skills
- Troubleshoot and support rollout of tooling and services that use Airflow (on K8s), Spark (managed), DataStax Enterprise, and DataStax Astra
- Create, troubleshoot, and refactor Python DAGs (a minimal illustrative DAG sketch follows this list)
- Create and deploy infrastructure as code via Ansible and Terraform
- Demonstrate familiarity with creating and destroying resources on GCP, including GCP monitoring dashboards
- Demonstrate an aptitude for RCA and troubleshooting code and systems-integration issues
- Familiarity with Scala, Python, and Java
- Conduct rapid POC development and be able to transfer knowledge to others
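Since the role involves creating, troubleshooting, and refactoring Python DAGs, here is a minimal sketch of what such a DAG can look like. The DAG id, schedule, and task callables are hypothetical placeholders, not any actual Anant or client pipeline.

# Minimal, hypothetical Airflow 2.x DAG for illustration only; names and
# schedule are invented and do not reflect any real client workflow.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from a source system (e.g., a Cassandra table).
    print("extracting source rows")


def load(**context):
    # Placeholder: write the extracted rows to the destination cluster.
    print("loading rows into destination")


default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_migration_dag",   # hypothetical name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task

Refactoring work in practice often means reshaping task dependencies like the final line above, or splitting monolithic callables into smaller tasks.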
Soft Skills
- Demonstrate a passion for excellence in work product and customer delivery
- Create and deliver live and recorded demos for customers and internal stakeholders
- Familiarity with the enterprise data platform ecosystem
- Continuous learning mindset
Working at Anant
- Anant performs business around the clock, but some availability during US Eastern Time business hours is important.
- Anant is a 100% remote workplace.
- Anant is currently looking to hire part-time, with future full-time work available.
< class="h2" id="About-Anant" style="margin: 1.8em 0px 0px; padding: 0px; font-style: inherit; line-height: 1.2; font-weight: 500; letter-spacing: -0.008em; text-transform: none; border-bottom-color: #cccccc;">About Anant
Anant is working to become the authoritative market leader in business platforms. Most technology leaders have a hard time retaining the experts to help them build and manage global data platforms because of the high cost of specialized talent. We created a training program for client teams and a network of trained specialists on our framework who are available on a full-time, part-time, or project-by-project basis.
Anant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.
< class="css-1baxzsz e3p9ckn0" id="comment-container" style="margin: 0px; padding: 0px; width: 50px; left: 1153px; top: 192px; max-height: 50px;"> < class="_19pkys9h _2hwx1wug _otyr1ylp _18u01wug _1bsb1osq _p12fukw8" style="margin: 5pc auto 40px; padding: 0px; max-width: 760px; width: 760px;"> < style="margin: 0px; padding: 0px;"> < class="css-iucoak-container" style="padding: 0px; position: relative; "> < class="labels__control css-1vk57ay-control" style="margin: 0px; padding: 0px; background-color: white; border-radius: 3px; cursor: default; min-height: 40px; position: relative; outline: 0px !important; border: 2px solid white;"> < class="labels__value-container labels__value-container--is-multi css-f9obad" style="margin: 0px; padding: 4px; position: relative; overflow: hidden;"> < class="css-yr9yzn" style="margin: 2px; padding: 2px 0px; visibility: visible;"> < class="labels__input" style="margin: 0px; padding: 0px; display: inline-block;"> < style="margin: 0px; padding: 0px; top: 0px; left: 0px; visibility: hidden; height: 0px; overflow: scroll; white-space: pre; font-weight: 400; font-style: normal; letter-spacing: normal; text-transform: none;"> < class="_syaz9sh9" style="margin: 0px; padding: 0px; color: #6b778c;"> < class="_1e0c1txw _4cvr1h6o" style="margin: 0px; padding: 0px;">< class="h1">Skills
Overview Anant is a destination employer for high-performing, erse, global talent. Our Data Engineers support the development, operation, and maintenance of real-time data processing. They oversee and deliver the success of client and internal projects. The Data Engineer will not only support our internal team, but will also participate in client project work including design of novel systems, debugging performance degradations and read/write latencies, audits, monitoring, and health checks. An ideal Data Engineering candidate will have experience supporting rollout of migration tooling through client environments by troubleshooting GKE, Airflow, Dataproc, DataStax Enterprise, and DataStax Astra. Other candidates will gain experience using these tools. We are looking for a Data Engineer to join our team immediately. We look for the best and brightest and those willing to learn. Responsibilities Work with multiple teams and multiple projects (e.g., application, infrastructure, cloud, etc.) to Complete requests (adding new or decommissioning existing clusters) Debug and resolve issues Utilize project management software (e.g., Jira) to log time and resolve tickets Create and update SOP’s, Runbooks, issue reports, and other documentation as required Consult on client projects, maintain client confidentiality and protect client operations by keeping information confidential Contribute to team effort by using effective communication skills, being a self-starter, and taking responsibility for deliverables Required Qualifications Certifications in Spark 5+ years of relevant software design and development or certifications in Cassandra / DSE 5+ years of relevant software design and development or certifications in Google Cloud Platform (GCP) At least ONE of the following: 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE) 5+ years of relevant software design and development in Cassandra (Astra) 2+ years of relevant software design and development including Airflow Additional Qualifications Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure 2+ years of relevant software design and development including the below as well as source control apps such as Bitbucket, Github, etc. 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform BS degree in Computer Science or related technical field involving coding, or equivalent practical experience Ability to troubleshoot, debug and optimize code Ability to identify and automate routine tasks Incident management and root cause analysis Strong organizational, time management, and detail skills. Strong communication and interpersonal skills, able to comfortably and pleasantly deal with a variety of people Hard Skills Troubleshoot and support rollout of tooling and services that use Airflow (on K8s), Spark (managed), DataStax Enterprise, and DataStax Astra, Create, troubleshoot, and refactor Python DAGs, Create and deploy infrastructure as code via Ansible and Terraform, Demonstrate familiarity with creating and destroying resources on GCP, including GCP monitoring dashboards, Demonstrate an aptitude for RCA and troubleshooting code and systems integration issues, and Familiarity with Scala, Python, and Java. 
Conduct rapid POC Development and be able to transfer knowledge to others Soft Skills Demonstrate a passion for excellence in work product and customer delivery Create and deliver live and recorded demos for customers and internal stakeholders Familiarity with the enterprise data platform ecosystem Continuous learning mindset Working at Anant Anant performs business around the clock, but some availability during during US Eastern Time business hours is important. Anant is a 100% remote workplace. Anant is currently looking to hire part time, with future full-time work available. About Anant Anant is working to become the authoritative market leader in business platforms. Most technology leaders have a hard time retaining the experts to help them build and manage global data platforms because of the high costs of specialized talent. We created a training program for client teams and a network of trained specialists on our framework who are available on a full, part, or on a project by project basis. Anant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.
< class="h1">Experience5+ years


Tags: africa, data, europe
Prismic is a rapidly growing website builder and platform. We use data to determine the direction of the business and need help from a creative and flexible analytics engineer (you will be our first analytics engineer and will report to our head of Engineering).
You are motivated to use your creativity to enable us to make strategic decisions and you will challenge us in the ways we need to grow, with a say in our prioritisation, helping us to ensure Acquisition, Revenue and Product attend to all of the opportunities on the table. You are central to our future growth.
Prismic has already built a Customer Data Platform (using segment.com), and we're ready to expand and move to a full-fledged data warehouse architecture (including dbt, Snowflake, and more). You will collaborate with our platform team to build data pipelines and warehouse architecture, to process data, and to build and deploy infrastructure that supports all of our data needs.
We have a strong desire to gravitate our architecture toward the state of the art, and will rely on you to help us get there.
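As a rough, hypothetical sketch of one small piece of the Segment-to-warehouse setup described above, the snippet below loads a couple of raw event rows into a Snowflake staging table with the snowflake-connector-python library. The account, credentials, and table names are placeholders; the real architecture (dbt models, orchestration, and so on) would be designed with the team.

# Hypothetical sketch: load raw event rows into a Snowflake staging table.
# Connection details and table names are placeholders, not Prismic's setup.
import snowflake.connector

events = [
    ("user_123", "page_viewed", "2023-01-01T12:00:00Z"),
    ("user_456", "signed_up", "2023-01-01T12:05:00Z"),
]

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",      # placeholder
    user="YOUR_USER",            # placeholder
    password="YOUR_PASSWORD",    # placeholder
    warehouse="ANALYTICS_WH",    # placeholder
    database="RAW",              # placeholder
    schema="SEGMENT",            # placeholder
)

try:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(user_id STRING, event_name STRING, event_ts STRING)"
    )
    # Bulk, parameterised insert of the raw rows.
    cur.executemany(
        "INSERT INTO events (user_id, event_name, event_ts) VALUES (%s, %s, %s)",
        events,
    )
finally:
    conn.close()

Downstream, dbt models would then transform these raw tables into analytics-ready ones.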
What will you be doing?🔧
Your insights will help us improve our technical stack and will combine effectively with usability research. Your business savviness will be fueled by automated data models and confirmed through dashboard communications that keep us all aware of the status of our data and avoid compromises in data quality.
Growth will be based on pinpointing with confidence our challenges and opportunities:
– Marketing: help identify the right audience, the most effective channels to communicate our promise, and the best way to onboard prospects into our product; evaluate the impact of marketing initiatives on conversion and word of mouth.
– Revenue: help guide growth hypotheses and sales decisions by detecting decision makers, and communicate macro trends and patterns to drive total customer success.
– Product: help our outcome-oriented product team set their goals, evaluate the value delivered, improve user satisfaction, reduce churn, improve usage, and upsell.
You’re a master communicator who wants broad technical and business exposure. You’ll help to enable our business teams, our most important stakeholders, to understand and satisfy their own needs through analytics. In turn, this will help support our customers and their growing set of needs, and thus help us to grow and run our business. Your fresh ideas will be warmly welcomed.
Are you the one? 🧠
– 3+ years of experience as a data analyst, analytics engineer or data engineer, mastery of data libraries
– Coding (Python; SQL)
– Customer data (dbt)
– Leadership (rapid and autonomous assumption of new topics or tools among many teams in a new frontier)
– Communication (understanding and reflection of business initiatives, including transforming needs into insights)
– Data visualisation (analysis, synthesis, critical thinking, presentation, storytelling)
It would be super cool (but not required) if you had experience with SaaS solutions like Segment, Amplitude, Hubspot, Google Tag Manager, and also AWS Services like Athena, Step Functions or even Lambda.
If you have a tester’s mindset, all the better. A dream candidate would be a strong contributor and have all of these.
What are the perks? 🎉
– Latest Macbook;
– A budget for you to equip your home-office setup;
– English classes for all levels;
– Solving challenging problems, while building cutting-edge technology;
– Working in a super culturally diverse team, with fun and curious folks.
(also other benefits, that may depend on the country you’re based in)
When you come to the office, indulge!
– Healthy snacks and drinks;
– Yoga classes 3x/week.
Afraid of missing out if you’re remote? 🌍 Worry not!
You get the chance to visit us every once in a while and spend some days at the office, in Paris;
We have virtual initiatives and events, for us to stay connected with each other and be able to have the precious water-cooler conversation.
We also have regular global meetings, where every team member is free to raise their hand and discuss any topic with the whole company - we do our best to nurture a relaxed and informal atmosphere, where you can have the conditions to feel supported, thrive at your job and keep learning.
So, no matter where you are, it’s important for us to make you a part of that culture.



Tags: data, 🇺🇸 USA only
< class="h3">Company Description

AETOS LLC is a Minority Owned CVE Certified Service Disabled Veteran Owned Small Business (SDVOSB) providing information technology solutions focused on building a business that is customer-centered and performance-oriented. At Aetos, we specialize in developing IT solutions to optimize functionality and efficiencies for government and commercial clients to meet their business needs.
< class="h3">Job DescriptionOur ideal candidate will have a proven track record in leading and delivering Azure Data Analytics solutions with Enterprise level organizations. We are seeking someone who can enable advance analytic capabilities preparing a data lake by ingesting raw data from authoritative sources using defined patterns and curating the same to publish data products, using cloud native tools. You will be highly proficient working with business stakeholders and directing project teams and will have direct experience of a range of BI Tools and Platforms.
The candidate will have at least 2 years of Azure experience and a minimum of 7 years' total experience, with a proven track record of designing and developing enterprise-level data platforms. You will manage your own performance as well as others' to deliver high-quality, highly performant solutions to our customer.
You will be responsible for the design of solutions which will include:
- Solution design using Microsoft Azure services and related tools.
- Design of enterprise data models and Data solutions.
- Specification of ELT (extract, load, and transform) pipelines, data integration, and data migration design, with low-code/no-code tools (a toy sketch of the raw-to-curated pattern appears after this list).
- Design & implementation of Master data management solutions.
- Specification of Data Quality Management methodology and supporting technology tools.
- Working within a project management/agile delivery methodology in a leading role as part of a wider team.
- Deal with other stakeholders/end users in the software development lifecycle – PMs, BAs, testing etc.
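The production pipelines here would be built on the Azure services listed under Required Skills (Data Factory, Synapse, Data Lake). Purely as a toy illustration of the raw-to-curated pattern described in the job description, a minimal local version in Python with pandas might look like the following; the paths, columns, and rules are invented.

# Toy illustration of a raw-to-curated data-lake step using pandas.
# Paths, columns, and rules are invented; a production version would run
# on Azure Data Factory / Synapse against Azure Data Lake storage.
import pandas as pd

RAW_PATH = "raw/customers.csv"              # hypothetical raw-zone file
CURATED_PATH = "curated/customers.parquet"  # hypothetical curated-zone file

# Ingest: read the raw extract exactly as delivered by the source system.
raw = pd.read_csv(RAW_PATH)

# Curate: apply simple, repeatable rules before publishing a data product.
curated = (
    raw.drop_duplicates(subset=["customer_id"])       # de-duplicate on the key
       .assign(signup_date=lambda df: pd.to_datetime(df["signup_date"]))
       .dropna(subset=["customer_id"])                # drop rows without a key
)

# Publish: write a columnar, analytics-friendly file to the curated zone.
curated.to_parquet(CURATED_PATH, index=False)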
Required Skills and Experience
- Bachelor's Degree in computer science, or other engineering or technical discipline is required.
- 10 years of experience
- Direct experience of solution shaping and architecture development during planning & delivery including excellent documentation skills.
- Strong evidence of Data modeling, mapping data flows and data design.
- ELT experience.
- Hands-on experience solutioning and implementing analytical capabilities using the Azure Data Analytics platform including, Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Storage, Azure SQL Data Warehouse/Synapse, Azure Data Lake.
- Experience in the design of reporting & data visualization solutions such as Power BI or Tableau.
- Experience with Master Data Management & Data Quality tools.
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation pipelines.
- Microsoft Azure Data Platform certification ideal (DP-200, DP-201)
All your information will be kept confidential according to EEO guidelines.

We are looking for a Senior Data Engineer with experience working with, managing, and deploying complex systems; expert-level Python programming and debugging skills; familiarity with REST APIs, cloud infrastructure, CI/CD, and Git; and familiarity with UI development and frameworks (React/Angular, Flask, PHP, JS, etc.).
Skills required to contribute:
Primary Skills:
- Python (for writing business logic)
- Flask (API framework behind MindMeld; a minimal Flask + Redis sketch follows this list)
- aiohttp (Python-based async web framework)
- OpenShift CAE (Red Hat-based Kubernetes platform)
- Redis (for caching)
- Snowflake (for storing data)
Secondary Skills:
- RASA (NLP framework)
- Docker (containers)
- Elasticsearch (for telemetry)
- Grafana (for displaying stats)
- Prometheus (for monitoring services)
- Experience: 4+ years in relevant technologies.
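As a minimal sketch of how two of the primary skills above (Flask and Redis) typically fit together, the snippet below exposes one cached API endpoint. The route, key names, and lookup function are invented placeholders, not this team's actual service.

# Hypothetical sketch: a small Flask endpoint that caches results in Redis.
# Names are invented; a real lookup might query Snowflake instead.
import json

import redis
from flask import Flask, jsonify

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379, db=0)  # placeholder Redis


def fetch_profile_from_store(user_id: str) -> dict:
    # Placeholder for a real lookup (e.g., a Snowflake query).
    return {"user_id": user_id, "segment": "demo"}


@app.route("/profiles/<user_id>")
def get_profile(user_id: str):
    key = f"profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        # Serve from cache when a recent value exists.
        return jsonify(json.loads(cached))

    profile = fetch_profile_from_store(user_id)
    cache.set(key, json.dumps(profile), ex=300)  # cache for 5 minutes
    return jsonify(profile)


if __name__ == "__main__":
    app.run(debug=True)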


Tags: data, spain
Who we are
We're a global marketing service provider and we specialize in affiliate marketing & publishing. We are digital natives, data obsessed and focused on measurable outcomes. Some of the most talented individuals you'll ever meet, all with one thing in common: doing great work, and growing as a team.
Our value lies in Diversity, Equity, Inclusion and Belonging. We strongly believe in equality and stand against all kinds of discrimination. We dare to be unapologetically ourselves. Come join a team of explorers who are motivated by growth, and driven by results.
What you'll do
The successful candidate will be responsible for the development of machine learning products in line with the global R&D roadmap, including data gathering, data preparation, model development, and evaluation, following standard ML good practices. In collaboration with SEO and conversion experts, they will leverage big data to pioneer innovation in SEO, conversion optimisation, and web development. This position reports to the Data Science Team Lead.
- Work closely with the rest of the R&D team to develop or adapt AI models that predict desired outputs, leveraging big transversal and longitudinal data (a minimal sketch of the modelling workflow follows this list);
- Follow an agile and flexible team organisation in order to deliver optimal performance for each project;
- You are a Data Scientist who can:
  - Work closely with stakeholders to define model requirements and evaluate results;
  - Collaborate with internal team members (SEO, Conversion, Data Engineers) on best practices, prioritisation, and developing end-to-end ML products;
  - Provide a very good understanding of the methodology used and provide required training to non-expert stakeholders;
  - Deliver exceptional work and innovative models, on strategy and on time.
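Purely as a generic illustration of the gather, prepare, model, and evaluate workflow described above, a minimal supervised-learning example with scikit-learn might look like this; the bundled dataset and model choice are placeholders and do not reflect the team's actual products.

# Generic gather -> prepare -> train -> evaluate sketch with scikit-learn.
# Dataset and model are placeholders for illustration only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Data gathering: a bundled example dataset stands in for real product data.
X, y = load_breast_cancer(return_X_y=True)

# Data preparation: hold out a test set for honest evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Model development: a simple baseline classifier.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluation: report precision/recall/F1 on the held-out set.
print(classification_report(y_test, model.predict(X_test)))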
What you'll bring
- Master's degree or equivalent in data science, or proven equivalent experience in a technical organization or related field;
- Experience in machine learning is a must, in at least one of the following applications: reinforcement learning, clustering, supervised learning (regression, classification);
- Experience in Python programming;
- Attention to detail and ability to multi-task in a challenging and fast environment;
- Strong analytic skills, able to analyze and interpret data and propose actionable insights;
- Strategic thinker, able to anticipate, think critically, evaluate, and act;
- A communicator and relationship builder, able to navigate effectively across organizations and cultures.
Strong consideration will be provided to candidates that also offer the following:
- Experience with SQL or other DB queries
- Experience with Agile Project Management tools such as Jira;
- Prior experience with SCRUM/Agile methodologies
- Experience with NLP related Machine Learning
- Able to work with good code practices, Git, Unit testing
What's in it for you
- Relocation & soft landing for you and your family (applicable in Spain, if you are moving from a different city/country)
- Attractive salary
- Competitive private health & life insurance package
- Flexible working hours and remote-friendly tools and methodologies to stay connected
- Workplace perks such as coffee, fresh fruit, bread, sweets and drinks provided daily
- In addition to regular leave yearly, six weeks' paid leave for every four years of service
- Be part of a multicultural environment
- Wellbeing programme
- Different employee events throughout the year and team building activities
- Career Development training and programs to help you grow!
