Data Engineer in Risk Management at eBay

Category: Data & Analytics, Developer/Engineer, Engineering, Software Engineering
Location: Austin, Texas

At eBay, you will be part of a purpose-driven community dedicated to creating a bold and versatile work environment. In eBay Payments, you will be an integral member of a growing organization that inspires passion, courage, and inventiveness – creating the future of global commerce and making an important, positive impact on millions of eBay sellers and shoppers around the world. If you are looking for a special place to take your Payments career to the next level, we want to talk with you!

Risk Management is at the core of Payments done well – and we are hiring curious, driven, and courageous experts to transform our business unit to enable eBay’s next generation Payments strategy. Our focus is to ensure the integrity of our marketplace for buyers and sellers who transact with us every single day. The scope of our charter includes Risk Management Strategy, Policy, Decision Sciences, and Policy Operations.

We are looking for a highly talented and self-motivated data engineer to join our Decision Science team. Decision Science comprises both data scientists and data engineers responsible for creating and implementing state-of-the-art machine learning algorithms for fraud detection and risk assessment in support of Risk Management. The primary responsibility of this role is to assist in algorithm deployment inside a high-throughput, low-latency big data environment.

Primary Job Responsibilities

Within the Decision Science team, the data engineer will work side by side with our data scientists to develop applications that bring machine-learning models to life in a production environment. You will leverage core big data and deployment infrastructure to aggregate and structure data, monitor the quality of data feeds, build integration layers between data sources and machine-learning models, and ultimately detect fraud in real time. You will design, customize, and automate the processes needed to build a robust data science pipeline, linking production results to our big data stack for monitoring and model retraining. Strong self-motivation, a passion for data, and an interest in high-performance computing systems are the keys to success in this role.

Required Skills and Experience:

  • Bachelor’s degree in computer science or a related field; MS/PhD preferred.
  • 1+ years of related experience.
  • Extensive experience with event-stream data handling (e.g., Flink/Storm, Kafka).
  • Extensive experience with the Hadoop ecosystem: HDFS, Spark, Hive, Spark SQL, etc.
  • Working software development experience in one or more major programming languages (e.g., Java, Scala).
  • Experience with Python.
  • Experience with containerized deployment platforms (e.g., Kubernetes).
  • Experience with Service-Oriented Architectures, including building and consuming REST, XML, JSON, and WSDL/SOAP interfaces, as well as exchanging data with NoSQL databases.
  • Experience with CI/CD platforms.
  • Knowledge of relational databases (e.g., Teradata, Oracle).
  • Knowledge of machine learning is a plus.

Basic Qualifications:

  • A keen strategic and analytical thinker with the ability to put complex ideas into clear frameworks and to use data to drive innovation.
  • Embodies the work ethic and personality that thrive in a fast-paced culture with tight deadlines, shifting priorities, and matrix-managed responsibilities.
  • Seeks to identify root causes and recommends solutions to prevent problems from recurring. Shows a high level of skill in breaking down problems into their essential elements, carrying out a diagnosis, and developing a solution.
  • Continuously seeks out improvement and innovation opportunities in analytic technologies that have an immediate and meaningful impact on the business. Provides a source of high-quality ideas.
  • Collaborates with peers across the broader organization on ideation and innovation. Provides technical knowledge and thought leadership to the broader analytic community in their area of expertise.
  • Has the ability to identify and involve the right resources to accomplish projects. Knows when and where to get data and how to apply scientific and academic expertise to decisions.