We are building a cutting-edge platform designed to optimize storage and analytics over Data Lakehouse architectures. Our mission is to boost efficiency and query speed on massive datasets, especially within distributed cloud environments.
* Design, develop, and optimize our data indexing technologies using Java, Scala, and Rust.
* Extend support for open table formats (e.g., Apache Iceberg).
* Ensure product quality by fixing bugs, building robust test cases, and contributing to resilient frameworks.
* Collaborate across multiple areas of the startup, wearing multiple hats when needed.
analyze big data.
exceptional candidates).
* Experience:
  o Minimum 3 years of professional software development experience.
* Education:
  o Bachelor's, Master's, or PhD in Computer Science, Software Engineering, or related fields.
* Technical Skills:
  o Strong foundation in algorithms and data structures.
  o In-depth understanding of database internals.
  o Experience with Data Lakehouse formats (Delta Lake, Hudi, Iceberg) is highly desirable.
  o Knowledge of cloud storage I/O optimization is a strong plus.
  o Advanced programming skills in Java, Scala, and Rust (other languages are a plus).
  o Familiarity with query engines like Spark, Trino, or DataFusion.
* Soft Skills:
  o Fluent English (working language).
  o Strong organizational skills, autonomy, and a proactive mindset.
  o Curiosity and eagerness to learn and share knowledge.
  o Collaborative team spirit and startup mentality.