Company Name:

WebbMason Analytics

Location:

Raleigh, NC

Approximate Salary:

Not Specified

Date Posted:

April 9, 2019

Data Engineer

The WebbMason Analytics Data Engineer helps our clients turn data into knowledge so they can make better decisions faster. The Data Engineer works with clients and other team members to analyze and help define requirements, mine and analyze data, integrate data from a variety of sources, and participate in the design and implementation of reports, algorithms, and other data processing and analysis techniques. The most fundamental role of the Data Engineer is to deliver high-quality data pipelines that produce analytics-ready datasets.


Data Engineer Responsibilities:


  • Deliver end-to-end analytics projects, including data ingest, data transformation, data science, and data visualization
  • Design and deploy databases and data pipelines to support analytics projects
  • Clearly document datasets, solutions, findings, and recommendations to be shared internally and externally
  • Learn and apply tools and technologies proficiently, including:
      • Languages: SQL (standard and DB-specific), Python, R, Spark/Scala, Bash
      • Frameworks: Hadoop, Spark, AWS
      • Tools/Products: Data Science Studio, Alteryx, Jupyter, RStudio, Tableau, Power BI
  • Build compelling visualizations and dashboards that address the analytic needs of the end-user/customer
  • Optimize the performance of queries and dashboards
  • Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions
  • Analyze client data and systems to determine whether requirements can be met
  • Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team
  • Develop and communicate solution architectures and present solutions to both business and technical stakeholders
  • Provide end user support to other data engineers and analysts


Requirements:

  • Expertise in SQL and Python. Other programming languages (R, Scala/Spark, SAS, Java, etc.) are a plus
  • Experience with data and analytics technologies, including RDBMS, ETL, and BI
  • Experience with Hadoop or other big data technologies
  • Experience with AWS or other Cloud technologies
  • Experience with agile delivery methodologies and/or JIRA
  • Experience working on Linux command-line
  • BS or higher in a related field; Master’s degree preferred

Apply Now