This ad expired on 20/11/2022.

  • Company Name: Bell Canada
  • Job Type: Full-Time
  • Address: Mississauga, ON

As a member of the Enterprise Data Platforms team, reporting to the Network Topology and GIS Manager, the Big Data DevOps will play a leading role in the development of new products, capabilities, and standardized practices using various Big Data technologies. Working closely with our business partners, this person will be part of a team that advocates the use of Big Data technologies to solve business problems, and will be a thought partner in the Big Data analytics space.

Primary Responsibilities:

Participate in all aspects of the data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
Develop standardized practices for delivering new products and capabilities, including data acquisition, transformation, and analysis
Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)
Create formal written deliverables and other documentation, and ensure designs, code, and documentation are aligned with enterprise direction, principles, and standards
Train and mentor teams in the use of the fundamental components in various on-premises and cloud technology stacks, including Hadoop, ELK, Kafka, and Google Cloud Platform.
Assist in the development of comprehensive and strategic business cases used at management and executive levels for funding and scoping decisions on Big Data solutions
Troubleshoot production issues within the various data platform environments
Performance tuning of Hadoop and/or GCP data processes and applications
Strong communication skills, technology awareness, and the ability to interact and work with senior technology leaders are a must
Good knowledge of Agile methodology and the Scrum process
Delivery of high-quality work, on time and with little supervision
Critical thinking and analytical abilities

Basic Qualifications:

Bachelor's degree in Computer Science, Management Information Systems, or Computer Information Systems is required. A Bachelor's degree in Engineering with relevant experience will also be considered.
Minimum of 4 years of software development experience
Minimum of 2 years of building and coding applications using Hadoop components - HDFS, Hive, Impala, Sqoop, Flume, Kafka, StreamSets, HBase, etc.
Minimum of 2 years of development with Scala/Spark, Spark Streaming, Java, Python, and HiveQL (see the sketch after this list)
Minimum of 4 years of experience with traditional ETL tools and data warehousing architecture.
Strong personal leadership and collaborative skills, combined with comprehensive, practical experience and knowledge in end-to-end delivery of Big Data solutions.
Experience in MySQL, Oracle, MS SQL Server and other RDBMS is a plus.
Must be proficient in SQL/HiveQL
Hands-on expertise in Linux/Unix and scripting skills are required.
Excellent analytical skills. Ability to quickly learn, use, and understand the business context of a new dataset.
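
As a rough illustration of the kind of Scala/Spark and Kafka development referenced above, the following is a minimal sketch of a Spark Structured Streaming job that reads records from a Kafka topic. This is only a sketch under assumptions: the broker address, topic name, and object name are placeholders (not details from this posting), and the job assumes the spark-sql-kafka connector is available on the classpath.

    import org.apache.spark.sql.SparkSession

    object KafkaIngestSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-ingest-sketch")
          .getOrCreate()

        // Read a stream of records from a (hypothetical) Kafka topic.
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
          .option("subscribe", "events")                     // placeholder topic
          .load()

        // Kafka keys and values arrive as binary; cast them to strings for downstream use.
        val events = raw.selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

        // A console sink keeps the sketch self-contained; a real job would write to HDFS, Hive, etc.
        val query = events.writeStream
          .format("console")
          .outputMode("append")
          .start()

        query.awaitTermination()
      }
    }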

Preferred Qualifications:

Experience developing applications on Google Cloud Platform, leveraging services such as GKE, BigQuery, and Cloud Bigtable. Experience with Apache Airflow and Databricks is a plus.
Experience working with a graph database (JanusGraph, Neo4j, DSE, etc.) and building graph queries (Gremlin, Cypher, etc.) and algorithms.
Strong in-memory database and Apache Hadoop distribution knowledge (e.g. HDFS, MapReduce, Hive, Oozie, Spark)
Past experience using Maven, Git, Jenkins, Se, Ansible or other continuous integration tools is a plus
Proficiency with SQL, NoSQL, relational database design and methods
Deep understanding of techniques used in creating and serving schemas at the time of consumption
Identify requirements to apply design patterns like self-documenting data vs. schema-on-read (see the sketch after this list).
Played a leading role in the delivery of multiple end-to-end projects using Hadoop as the data platform.
Successful track record in solution development and growing technology partnerships
Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience.
Strong interpersonal and communication skills, including written, verbal, and technology illustrations.
Experience working with multiple clients and projects at a time.
Knowledge of predictive analytics techniques (e.g. predictive modeling, statistical programming, machine learning, data mining, data visualization).
Familiarity with different development methodologies (e.g. waterfall, agile, XP, scrum).
Demonstrated capability with business development in the big data infrastructure business.
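
As a rough illustration of the schema-on-read pattern referenced in the list above: raw files land without an enforced structure, and the consumer applies a schema only when reading. The following is a minimal Spark sketch in Scala; the field names and landing path are hypothetical, not taken from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

    object SchemaOnReadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("schema-on-read-sketch")
          .master("local[*]") // local session keeps the sketch runnable on a laptop
          .getOrCreate()

        // Schema-on-read: the landing zone stores raw JSON as-is;
        // the consumer declares the structure it needs at query time.
        val eventSchema = StructType(Seq(
          StructField("device_id", StringType, nullable = true),
          StructField("event_type", StringType, nullable = true),
          StructField("event_time", LongType, nullable = true)
        ))

        val events = spark.read
          .schema(eventSchema)       // schema applied at read time, not at write time
          .json("/data/raw/events/") // hypothetical landing path

        events.groupBy("event_type").count().show()

        spark.stop()
      }
    }

Because the schema is supplied by the reader, different consumers can interpret the same raw files differently; that flexibility is the trade-off against a fixed, self-documenting schema enforced at write time.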

#EmployeeReferralProgram
