Work From Home Jobs At Canonical | Software Engineer – Data, AI/ML & Analytics
Work From Home Jobs At Canonical:-
Canonical is hiring for the role of Python and Kubernetes Software Engineer – Data, AI/ML & Analytics in Chennai, India (Work From Home). The complete details about Work From Home Jobs At Canonical are as follows.
Company Name: Canonical
Required Education: Undergraduate degree
Required Skills: Python, Kubernetes, Data, AI/ML & Analytics
Salary Range: INR 8 LPA – 12 LPA
Job Type: Work From Home
Qualifications
What Canonical is looking for in you
- Professional or academic software delivery using Python
- Exceptional academic track record from both high school and university
- Undergraduate degree in a technical subject or a compelling narrative about your alternative chosen path
- Confidence to respectfully speak up, exchange feedback, and share ideas without hesitation
- Track record of going above-and-beyond expectations to achieve outstanding results
- Passion for technology evidenced by personal projects and initiatives
- The work ethic and confidence to shine alongside motivated colleagues
- Professional written and spoken English with excellent presentation skills
- Experience with Linux (Debian or Ubuntu preferred)
- Excellent interpersonal skills, curiosity, flexibility, and accountability
- Appreciative of diversity, polite and effective in a multi-cultural, multi-national organisation
- Thoughtfulness and self-motivation
- Result-oriented, with a personal drive to meet commitments
- Ability to travel twice a year for company events up to two weeks long
Additional skills that would be nice to have
The following skills may be helpful to you in the role, but we don’t expect everyone to bring all of them.
- Hands-on experience with machine learning libraries or tools
- Proven track record of building highly automated machine learning solutions for the cloud
- Experience with container technologies (Docker, LXD, Kubernetes, etc.)
- Experience with public clouds (AWS, Azure, Google Cloud)
- Working knowledge of cloud computing
- Passionate about software quality and testing
- Experience working on an open source project
What your day will look like
- Develop your understanding of the entire Linux stack, from kernel, networking, and storage, to the application layer
- Design, build and maintain solutions that will be deployed on public and private clouds and local workstations
- Master distributed systems concepts such as observability, identity, tracing
- Work with both Kubernetes and machine-oriented open source applications
- Collaborate proactively with a distributed team of engineers, designers and product managers
- Debug issues and interact in public with upstream and Ubuntu communities
- Generate and discuss ideas, and collaborate on finding good solutions
Skills Required
As a software engineer on the team, you’ll collaborate on an end-to-end data analytics and MLOps solution composed of popular open-source machine learning tools such as Kubeflow, MLflow, DVC, and Feast. You may also work on workflow, ETL, data governance, and visualization tools like Apache Superset, dbt, and Temporal, or on data warehouse solutions such as Trino or ClickHouse.
Your team will own a solution from the analytics and machine learning space, and integrate with the solutions from other teams to build the world’s best end-to-end data platform. These solutions may be run on servers or on the cloud, on machines or on Kubernetes, on developer desktops, or as web services.
Work From Home Jobs At Canonical Application Process:-
Apply Link:- Click Here To Apply (Apply before the link expires)
Note: Only shortlisted candidates will receive the call letter for further rounds.
To apply for this job please visit boards.greenhouse.io.