Overview
Introduction
A career in IBM Consulting is built on long-term client relationships and close collaboration worldwide. You’ll work with leading companies across industries, helping them shape their hybrid cloud and AI journeys. With support from our strategic partners, robust IBM technology, and Red Hat, you’ll have the tools to drive meaningful change and accelerate client impact. At IBM Consulting, curiosity fuels success. You’ll be encouraged to challenge the norm, explore new ideas, and create innovative solutions that deliver real results. Our culture of growth and empathy focuses on your long-term career development while valuing your unique skills and experiences.
Your Role And Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges, using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
- Working across the entire system architecture to design, develop, and support high quality, scalable products and interfaces for our clients
- Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
- Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
- Working with a variety of databases (relational systems such as PostgreSQL and Db2, and NoSQL stores such as MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
- Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Master's Degree
Required Technical And Professional Expertise
- SQL authoring, querying, and cost optimisation, primarily on BigQuery; proficiency in Python as an object-oriented scripting language (a sketch follows this list).
- Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming (see the streaming sketch after this list).
- Version control with Git; knowledge of Infrastructure as Code (Terraform) is preferred.
- Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems.
- Experience performing root-cause analysis on internal and external data and processes to answer specific business questions.
- Experience building and optimising data pipelines, architectures, and data sets, including processes that support data transformation, data structures, metadata, dependency management, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
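To give a flavour of the day-to-day work described above, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client library. It is illustrative only: the project, dataset, and table names are hypothetical placeholders, and the dry-run step simply shows one common cost-optimisation habit (checking bytes scanned before executing).

```python
# Sketch only: estimate scan cost with a dry run, then execute a parameterised
# query on BigQuery. "example-project.sales.orders" is a hypothetical table.
from google.cloud import bigquery

client = bigquery.Client()  # credentials and project come from the environment

SQL = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.sales.orders`
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 100
"""

params = [bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")]

# Dry run first: BigQuery bills by bytes scanned, so checking the estimated
# scan size before running the query is a simple cost check.
dry_run = client.query(
    SQL,
    job_config=bigquery.QueryJobConfig(
        query_parameters=params, dry_run=True, use_query_cache=False
    ),
)
print(f"Estimated bytes processed: {dry_run.total_bytes_processed:,}")

# Execute the query and iterate over the result rows.
rows = client.query(
    SQL, job_config=bigquery.QueryJobConfig(query_parameters=params)
).result()
for row in rows:
    print(row.customer_id, row.total_spend)
```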
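Similarly, a minimal sketch of the streaming side, assuming the google-cloud-pubsub client library and a hypothetical project and subscription name; it shows the streaming-pull pattern that typically sits in front of a transformation step or a sink such as BigQuery.

```python
# Sketch only: consume messages from a Pub/Sub subscription via streaming pull.
# "example-project" and "orders-sub" are placeholders, not real resources.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "orders-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where a record would be parsed, validated,
    # and handed off to a transformation or a sink (e.g. BigQuery).
    print(f"Received: {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        # Block briefly for the sake of the example; a production consumer
        # would normally run until interrupted.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```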