Larus is a big data company that offers innovative solutions designed to provide clients with the information they need to optimize their decision-making process, boost profits and stay competitive.
We have in-depth experience in highly scalable, high-performance big data analytics applications that transform complex, unorganized data into clear and actionable insights. We build data management and analytics platforms using both open-source and cloud-based solutions.
At Larus, we embrace a culture of experimentation and innovation, and we focus on attracting people who are committed to doing their best and who constantly strive to improve and learn.
Headquartered in Venice, we are growing and opening new branches in Pescara, Rome and Milan.
We are actively looking for Data Engineers to join our teams in creating Larus's next generation of data products and capabilities. We are looking for both senior developers and junior talent.
For junior talent, we provide a training plan in our academy.
We ask for availability to travel within Italy (mainly to Rome and Milan), alternating on-site days at the customer with remote work.
You will be involved in a variety of exciting and challenging projects.
As a Data Engineer you will design robust and scalable data-driven solutions and data pipeline frameworks to automate the ingestion, processing and delivery of both structured and unstructured batch and real-time streaming data.
We work on international projects, so knowing English is ideal; if you don't speak it yet, we'll help you learn.
- Data engineering or data management experience
- Demonstrated experience managing, optimizing, and processing large volumes of data
- 2+ years of experience with one or more modern programming languages (e.g. Python or Java) and related technologies (JDBC, JSON, REST APIs)
- Hands-on experience with Big Data technologies such as Spark, Hadoop, and Kafka
- Experience with Linux/UNIX to process large data sets
- Experience with NoSQL technologies
- Experience with Agile, DevOps, CI/CD frameworks
- Experience working on a Data Lake project or building a Machine Learning platform would be highly regarded
- Experience developing end-to-end data pipelines on large cloud-compute infrastructure such as Azure, AWS, or Google Cloud is a plus
- Curiosity about trends and emerging technologies in the data ecosystem
- You are passionate about a culture of learning and teaching.
- You love challenging yourself to constantly improve, participating in user communities and sharing your knowledge with your teammates
- Strong written and verbal communication skills
- Bachelor's or Master's degree in Informatics Engineering, Computer Science, or a related field, or equivalent experience
- Experience with data structures
- Great work environment
- Continuous education, training, and certification opportunities, including pair programming sessions
- An international work environment, open to all cultures
- A collaborative environment where everyone is given the space to grow personally and professionally
- We love open-source and we encourage everyone to contribute to projects or attend events. We own the “Neo4j – Italy User Group” Meetup.
- Opportunity to attend conferences as a speaker
- Flexibility to structure your work around results, leaving you time to enjoy and enrich your life
- The best hardware and software tools
- We pay close attention to the quality of the projects we undertake; we don't do "body rental" or treat people as "resources"
- Permanent contract with flexibility to work remotely
If you are interested in working with us, please send your CV to email@example.com