Data Engineer

Posted 25 August 2025
Location Kuala Lumpur
Job type Permanent
Discipline GTS
Reference J15702

Job description

We’re looking for people to join the Access family who share our passion for believing in better and who will help us continue to grow.

“Love Work. Love Life. Be You.” is central to our success and to how we give our customers the freedom to do more of what’s important to them.

What does Access offer you?

We offer a blended approach to office working, encouraging you to collaborate and connect in one of our thriving offices. We deliver on what we say, taking the development of our people seriously. We’ll work with you to progress your success plan and provide opportunities to accelerate your career. On top of a competitive salary, wellbeing days taking you to 25 days’ leave a year, and a health contribution, you’ll also be able to choose from a range of benefits to suit you. We’re an organisation that likes to give back, so you’ll also have three charity days allocated to support a cause that matters to you.

Day-to-day, you will:

• Use your data engineering experience to provide high-quality, robust datasets that underpin Engineering Operations across the business
• Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques
• Consume data directly from a wide array of sources (APIs, files, OData, graph databases, etc.) and take the data from raw, through cleaned and validated states, and on to model-ready metrics
• Design and implement ETL pipelines that integrate with existing systems and support modern data workflows
• Work with both structured and unstructured data sources
• Collaborate with cross-functional teams to deliver comprehensive data solutions
• Drive innovation by identifying automation opportunities and experimenting with new tools and technologies

Your skills and experiences might include:

• 3–4 years of data engineering experience with demonstrated expertise in cloud-based platforms, especially Databricks, utilising Spark and knowledge of the Lakehouse concept
• Strong SQL and Python skills
• Experience with API development and integration for data ingestion and system connectivity
• Understanding of ETL processes and data pipeline design
• Knowledge of CI/CD workflows and DevOps practices for data engineering
• Understanding of pipeline orchestration and workflow management
• Experience with Delta Lake and modern data lake architecture
• Experience implementing thorough test processes to produce high-quality deliverables
• Strong communication and problem-solving skills

Good to have:

• Experience with Neo4j or other graph databases for complex relationship modelling
• Understanding of Generative AI
• Knowledge of MCP (Model Context Protocol) or similar AI communication protocols
• Experience with modern workflow orchestration tools
• An understanding of data modelling theory (preferably Kimball), or practical experience of Power BI data modelling (star schema)

What are we all about?

The Access Group is one of the largest UK-headquartered providers of business management software to small and mid-sized organisations in the UK, Ireland, USA and Asia Pacific. It helps more than 100,000 customers across commercial and non-profit sectors become more productive and efficient.
Our products and solutions go beyond providing technology: we connect the right people with the right data, at the right time, through Access Workspace.

At Access, we are committed to creating a welcoming and inclusive environment where everyone can thrive. If you’re excited about this role (even if your previous experience doesn’t align perfectly), you might just be the perfect fit for us! We wholeheartedly believe in equality for all and the transformative power of diversity.

Why not join our vibrant team, where you can love what you do, love how you live, and most importantly, be authentically you? Let’s make a difference together.

Love Work. Love Life. Be You.