About AllTrails
AllTrails is the world’s most popular and trusted platform for outdoor exploration. We connect people to the outdoors, help them discover new places, and elevate their experiences on the trail. With the most comprehensive collection of trails in the world, AllTrails supports inclusive access to nature for a global community of millions of trailgoers. Every day, we solve incredibly hard problems to get more people outside, for their wellbeing and the collective care of the natural world. Join us!
This is a U.S.-based remote position. San Francisco Bay Area employees are highly encouraged to come into the office one day a week.
What You’ll Be Doing:
- Work cross-functionally to ensure data scientists have access to clean, reliable, and secure data, the backbone for new algorithmic product features
- Build, deploy, and orchestrate large-scale batch and stream data pipelines to transform and move data to/from our data warehouse and other systems
- Deliver scalable, testable, maintainable, and high-quality code
- Investigate, test for, monitor, and alert on inconsistencies in our data, data systems, or processing costs
- Create tools to improve data and model discoverability and documentation
- Ensure data collection and storage adhere to GDPR and other privacy and legal compliance requirements
- Uphold the best data-quality standards and practices, promoting that knowledge throughout the organization
- Build and deploy systems that enable machine learning and artificial intelligence product solutions
- Mentor others on industry best practices

Requirements:
- Minimum of 6 years of experience working in data engineering
- Expertise in using both SQL and Python for data cleansing, transformation, modeling, pipelining, etc.
- Proficiency in working with other stakeholders, converting requirements into detailed technical specifications, and owning and leading projects from inception to completion
- Proficiency in working with high-volume datasets in SQL-based warehouses such as BigQuery
- Proficiency with parallelized Python-based data processing frameworks such as Google Dataflow (Apache Beam), Apache Spark, etc.
- Experience using ELT tools like Dataform or dbt
- Professional experience maintaining data systems in GCP and AWS
- Deep understanding of data modeling, access, storage, caching, replication, and optimization techniques
- Experience orchestrating data pipelines and Kubernetes-based jobs with Apache Airflow
- Understanding of the software development lifecycle and CI/CD
- Experience with monitoring and metrics-gathering tools (e.g., Datadog, New Relic, CloudWatch)
- Willingness to participate in a week-long on-call support rotation (currently, each person's turn comes up about once a month)
- Proficiency with Git and working collaboratively in a shared codebase
- Excellent documentation skills
- Self-motivation and a deep sense of pride in your work
- Passion for the outdoors
- Comfort with ambiguity, and an instinct for moving quickly
- Humility, empathy, and open-mindedness; no egos
- AI Native: You naturally incorporate AI tools to enhance your work. You’re comfortable writing prompts, evaluating AI outputs, and enjoy experimenting with new ways to boost creativity, productivity, and decision-making.

Nature celebrates you just the way you are, and so do we! At AllTrails we’re passionate about nurturing an inclusive workplace that values diversity. It’s no secret that companies that are diverse in background, age, gender identity, race, sexual orientation, physical or mental ability, ethnicity, and perspective are proven to be more successful. We’re focused on creating an environment where everyone can do their best work and thrive.
AllTrails participates in the E-Verify program for all remote locations.