Larus is a big data company that offers innovative solutions designed to provide clients with the information they need to optimize their decision-making process, boost profits and stay competitive.
We have in-depth experience in highly scalable and high-performance big data analytics applications that transform complex and unorganized data into clear and actionable insights.
At Larus, we embrace a culture of experimentation and innovation, focusing on attracting people who are committed to doing their best and who constantly strive to improve and learn.
Headquartered in Venice, we are growing and opening new branches in Pescara, Rome and Milan.
For our offices in Pescara, Rome and Milan, we are actively looking for Java Developers to join our teams in creating Larus' next generation of data products and capabilities.
We ask for availability to travel (mainly to Rome and Milan), alternating on-site days at the customer's premises with remote work.
Data-driven apps help enterprises make faster, smarter and better-informed decisions, and provide solutions that enable organizations to achieve their goals.
As a Java Developer, you will be responsible for the design, development and testing of scalable data-driven solutions, the next generation of our products, and data pipelines between enterprise systems and analytics platforms.
You will continuously improve data operations: automating manual processes, optimizing data delivery and re-designing architectures for greater scalability.
You will ensure data quality checks and data lineage are implemented at each hop of the data pipeline.
You will stay in tune with emerging trends in Big Data, NoSQL and cloud technologies, and participate in the evaluation of new technologies.
You will be involved in a variety of exciting and challenging projects.
We work on international projects, so speaking English is ideal; if you don't speak it yet, we'll help you learn.
For junior talent we provide a training plan in our Academy.
- Experience with one or more modern programming languages (Java, Scala, Kotlin)
- Experience managing, optimizing and processing large volumes of data
- Hands-on experience with Big Data technologies such as Spark and Hadoop
- Experience with Linux/UNIX to process large data sets
- Experience with NoSQL technologies
- Experience with Agile, DevOps, CI/CD frameworks
- Experience developing end-to-end data pipelines on large cloud-compute infrastructures such as Azure, AWS or Google Cloud is a plus
- Being curious about trends and emerging technologies in the Data ecosystem
- You are passionate about a culture of learning and teaching
- You love challenging yourself to constantly improve, participating in user communities and sharing your knowledge with your teammates
- Strong written and verbal communication skills
- Bachelor's or Master's degree in Informatics Engineering, Computer Science or a related field, or equivalent experience
- Experience with big data (Hadoop, Spark, Kafka)
- Experience with data structures
- Great work environment
- Continuous education and training, with certification opportunities and pair programming sessions
- An international work environment, open to every culture
- A collaborative environment where everyone is given the space to grow personally and professionally
- We love open source and encourage everyone to contribute to projects and attend events. We run the "Neo4j – Italy User Group" Meetup.
- Opportunity to attend conferences as a speaker
- Flexibility to structure your work around results, leaving you time to enjoy and enrich your life
- The best hardware and software tools
- We pay a lot of attention to the quality of the projects we undertake; we don't do "body rental" of "resources"
- Permanent contract with flexibility to work remotely
If you are interested in working with us, please send your CV to email@example.com