Onix is a global cloud, data, and AI solutions company and a 16-time Google Cloud Partner of the Year, helping enterprises accelerate their data-to-AI transformation through cloud migration, analytics, and AI-driven modernization. Headquartered in Princeton, New Jersey, Onix serves 1,500+ customers including Fortune 500 organizations and delivers proprietary, automation-led platforms that enable faster, cost-efficient digital transformation.
Key Responsibilities:
Design, develop, and maintain data pipelines and ETL/ELT workflows using GCP-native tools and services.
Build and optimize data warehouses using Snowflake.
Write complex and efficient SQL queries for data transformation, analysis, and reporting.
Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions.
Implement data governance, security, and monitoring best practices across GCP projects.
Tune queries and optimize performance of large-scale datasets.
Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.
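To give candidates a flavor of the pipeline work described above, here is a minimal, self-contained sketch of a single ELT transform step in plain Python. It is a toy illustration only, not production code or any Onix tooling: the field names, cleaning rules, and sample data are invented for this example.

```python
import csv
import io

# Toy ELT transform step: normalize raw order records before loading
# them into a warehouse. All field names and rules are hypothetical.
RAW = """order_id,amount,region
1001, 25.50 ,us-east
1002,,eu-west
1003, 40.00 ,us-east
"""

def transform(raw_csv: str) -> list[dict]:
    """Strip whitespace, drop rows missing an amount, and cast types."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        amount = rec["amount"].strip()
        if not amount:  # reject incomplete records
            continue
        rows.append({
            "order_id": int(rec["order_id"]),
            "amount": float(amount),
            "region": rec["region"].strip(),
        })
    return rows

clean = transform(RAW)
print(len(clean))                        # 2 valid rows survive
print(sum(r["amount"] for r in clean))   # 65.5
```

In a real pipeline this kind of logic would typically run inside a Dataflow job or an Airflow task in Cloud Composer, with the cleaned rows loaded into BigQuery or Snowflake.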
Required Skills & Qualifications:
3+ years of experience in a data engineering or data platform role.
Strong hands-on experience with Snowflake data warehousing.
Expert-level SQL skills: able to write complex, optimized, and scalable queries.
Experience with data modeling (star/snowflake schema), partitioning, clustering, and performance tuning in a data warehouse.
Familiarity with modern ELT tools such as dbt, Fivetran, or Cloud Data Fusion.
Experience with Python or a similar scripting language for data engineering tasks.
Understanding of data governance and privacy practices.
Working knowledge of Google Cloud Platform services, especially BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer.
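As a small illustration of the star-schema modeling mentioned above, the sketch below builds one fact table and one dimension table and runs a typical join-and-aggregate query. sqlite3 stands in for Snowflake purely so the example is self-contained; the table and column names are invented.

```python
import sqlite3

# Minimal star schema: a fact table keyed to a dimension table.
# sqlite3 is used here only for illustration; names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    sale_id   INTEGER PRIMARY KEY,
    region_id INTEGER REFERENCES dim_region(region_id),
    amount    REAL
);
INSERT INTO dim_region VALUES (1, 'us-east'), (2, 'eu-west');
INSERT INTO fact_sales VALUES
    (10, 1, 25.5), (11, 2, 12.0), (12, 1, 40.0);
""")

# Typical analytic query: total sales per region via the dimension join.
totals = dict(con.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_region d USING (region_id)
    GROUP BY d.name
"""))
print(totals)
```

In Snowflake or BigQuery the same shape of query benefits from the partitioning and clustering choices listed in the qualifications above, since the join key and grouping column drive pruning.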