
Neo BI Solution
About the Role
Do you thrive on solving complex data challenges and envision how data can drive business transformation? Are you passionate about building and scaling cutting-edge data solutions? If so, this opportunity is for you!

We are looking for a Data Solution Architect to join our global team and play a pivotal role in advancing our in-house, cloud-native data platform. This role acts as a Field Engineer, ensuring the adoption and success of our platform by engaging deeply with engineering teams, setting technical standards, and driving best practices across the organization.
You will work at the forefront of Big Data, Spark, and Lakehouse architecture, collaborating with global teams at one of the world's largest Consumer Packaged Goods (CPG) companies.
Key Responsibilities
· Act as a technical leader, guiding engineering teams on data architecture, ingestion frameworks, and best practices.
· Develop reference architectures, how-to guides, and demo applications to scale adoption and implementation.
· Conduct Proofs of Concept (PoCs) to evaluate and select the best technology solutions.
· Support development teams in resolving complex data challenges.
· Collaborate extensively with business and technology stakeholders to align solutions with business needs.
· Advocate for a data-driven approach, ensuring measurable success of data programs.
· Implement best practices in systems integration, security, performance, and data management.
· Ensure data governance, compliance, and security best practices across all implementations.
What You Bring
· Strong programming skills in Python, Scala, and Spark, with a passion for clean, maintainable code.
· Hands-on experience with big data processing using Spark and Kubernetes.
· Expertise in data orchestration tools such as Airflow and Azure Data Factory.
· Experience with data extraction from multiple sources (APIs, RDBMS, SAP, MongoDB, Oracle) using different ingestion methods (full, incremental, CDC).
· Strong understanding of cloud-based data services (Azure, GCP).
· Proficiency in CI/CD pipelines and DevOps for data solutions.
· Excellent communication and presentation skills, including formal presentations and technical whiteboarding.
· Ability to translate business needs into technical solutions and collaborate with cross-functional teams.
· Proficiency in data modeling techniques and performance tuning for large-scale data systems.
· Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field.
Languages
· Fluent English communication skills, both written and spoken (a must).
Bonus Points For
· Knowledge of modern data architectures (Lakehouse, Kappa, Lambda, Data Warehousing).
· Hands-on experience with Object-Oriented Programming (OOP) in Python.
· Experience in optimizing data pipelines for performance and scalability.
· Ability to mentor junior engineers and provide insightful code reviews.
· Strong problem-solving skills and a self-driven, autodidactic mindset.
· Good understanding of product development lifecycles.
Why Join Us?
· Work in a global, fast-paced environment with a world-class data platform.
· Be at the forefront of data innovation, solving real-world business challenges.
· Collaborate with top talent in a high-impact, influential role.
· Enjoy opportunities for growth, mentorship, and continuous learning.
If you are excited about making a big impact with data, apply now and be part of our journey!

Location: Remote
Model: Service Contract (PJ - Independent Contractor)