On behalf of Data & Analytics, Ciklum is looking for a Lead Big Data Engineer to join the UA team on a full-time basis.
You will join a highly motivated team and will be working on a modern solution for our existing client. We are looking for technology experts who want to make an impact on new business by applying best practices and taking ownership.
Project description:
The project is a digital transformation initiative to enable an improved customer experience around access to all aspects of legal services. The digital platform is being developed for a large legal outsourcing company in the United States.
The platform consists of a legal workflow management application, developed using Dynamics 365, and a data platform to drive analytics and dashboards.
This platform needs to import data in batches from a number of client systems and also from the Dynamics 365 part of the application.
The data therefore needs to be cleaned, joined, transformed, and made available to Power BI for reporting.
With the platform based entirely in the Azure cloud, it is a great opportunity to learn Azure best practices and to work with Data Lake (Gen 2), Databricks, Data Factory, Power BI, and Analysis Services, using a standard ETL, data lake, data mart, and OLAP model architecture.
Responsibilities:
Responsible for the building, deployment, and maintenance of mission-critical analytics solutions that process data quickly at big data scale
Contributes to the design, code, configuration, and documentation of components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple data stores
Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed
Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members
Interacts with engineering teams and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
Performs development, QA, and DevOps roles as needed to ensure total end-to-end responsibility for solutions
Works directly with business analysts and data scientists to understand and support their use cases
Contributes to CoE activities and community building, participates in conferences, and shares expertise and best practices
Helps with sales activities, customer meetings, and digital services
Requirements:
2+ years of Big Data experience
2+ years of hands-on implementation experience with a combination of the following technologies: Hadoop, MapReduce, Pig, Hive, Impala, Spark, Kafka, Storm, and SQL and NoSQL data stores such as HBase and Cassandra
5+ years of experience coding in SQL, Java, Python, C#, or Scala, with solid CS fundamentals including data structure and algorithm design
3+ years contributing to production deployments of large back-end data processing and analysis systems as a team lead
2+ years of experience with cloud data platforms (Azure)
Hands-on Databricks experience (or a desire to learn it)
Knowledge of BI reports and dashboards design and implementation (Qlik Sense, Power BI, Tableau or DOMO visualization tools)
Knowledge of SQL and MPP databases (e.g. Vertica, Netezza, Greenplum, Aster Data)
Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
Knowledge of data warehousing design, implementation, and optimization
Knowledge of data quality testing, automation, and results visualization
Experience participating in an Agile software development team, e.g. Scrum
Experience designing, documenting, and defending designs for key components in large distributed computing systems
A consistent track record of delivering exceptionally high quality software on large, complex, cross-functional projects
Demonstrated ability to learn new technologies quickly and independently
Ability to handle multiple competing priorities in a fast-paced environment
Undergraduate degree in Computer Science or Engineering from a top CS program required; Master's preferred
Desirable:
Experience supporting data scientists and complex statistical use cases
Understanding of cloud infrastructure design and implementation
Experience in data science and machine learning
Experience in Back End development and deployment
Experience in CI/CD configuration
Good knowledge of data analysis in enterprises
Curious mind and willingness to work with the client in a consultative manner to find areas for improvement
Upper-Intermediate or Advanced English
Good analytical skills
Good team player, motivated to develop professionally and to solve complex tasks
Self-motivated, self-disciplined and result-oriented
Strong attention to detail and accuracy
What's in it for you
A Centre of Excellence is ultimately a community that allows you to improve yourself and have fun. Our centres of excellence (CoE) bring together all Ciklumers from across the organization to share best practices, support, advice, and industry knowledge, and to create a strong community.
Close cooperation with the client
A constant flow of new projects
Dynamic and challenging tasks
Ability to influence project technologies
Projects from scratch
Team of professionals: learn from colleagues and gain recognition for your skills
European management style