About the Role
We are looking for a GCP Data Analyst with deep BigQuery expertise, advanced SQL and Python skills, and a keen analytical mindset. This position supports data validation efforts and ongoing analytics work. The ideal candidate will be adept at working with large datasets, writing efficient queries, and identifying data issues with accuracy and insight.
You will work across various data processes, from validating metrics during system migrations to supporting routine analysis and reporting. The role requires using advanced BigQuery capabilities—such as authorized views, materialized views, user-defined functions (UDFs), partitioning, and time series analysis—to maintain data quality and uncover actionable insights. Proficiency in Python, particularly with data frames and relevant libraries, is essential for data manipulation, anomaly detection, and workflow prototyping.
A strong grasp of data engineering principles and GCP infrastructure is important, along with the ability to read and understand Java or Scala code when collaborating with engineering teams. Familiarity with Airflow (Composer) is helpful for understanding data pipeline orchestration, though it is not a core responsibility. Experience with BigQuery ML, anomaly detection tools, or Vertex AI is a plus.
Key Responsibilities
– Write, optimize, and run complex SQL queries in BigQuery to validate data, detect inconsistencies, and support analytics.
– Analyze large datasets to evaluate data quality, compare trends across systems, and identify anomalies.
– Apply advanced BigQuery features like authorized views, materialized views, UDFs, partitioned tables, and joins for scalable analysis.
– Use Python and data frames for exploratory analysis, data manipulation, and validation workflows.
– Conduct time series analysis and anomaly detection using SQL or Python.
– Validate data loads and transformations to ensure pipeline accuracy.
– Collaborate with engineers to understand data pipelines, with the ability to read Java or Scala code as needed.
– Compare datasets across systems to ensure consistency during and after migrations.
– Understand orchestration tools like Airflow (Composer) to follow pipeline logic and collaborate effectively.
– Work within the GCP ecosystem, using cloud tools to analyze data, troubleshoot issues, and manage workflows.
– Communicate findings and data quality concerns clearly to stakeholders to support informed decisions.
Qualifications
– Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field.
– 5+ years of experience in data analysis or analytics engineering, with strong skills in BigQuery, SQL, and Python.
– 5+ years of experience working with Google Cloud Platform (GCP).
– Expertise in writing and optimizing SQL for data validation, trend analysis, and identifying discrepancies.
– Proficient in Python, including data frames and common analytical libraries.
– Experience using advanced BigQuery features like authorized views, materialized views, UDFs, partitions, and time series analysis.
– Strong analytical skills and experience validating data across systems during migrations and ongoing operations.
– Basic ability to read and interpret Java or Scala code to support collaboration with engineers.
– Familiarity with Airflow (Cloud Composer) to understand and trace data pipelines.
Preferred Qualifications
– Experience with Looker or other BI tools for metric validation and reporting.
– Knowledge of BigQuery ML and Vertex AI.
– Basic familiarity with legacy Hadoop-ecosystem tools such as Oozie or Pig, sufficient to read existing scripts and workflows.
Required Skills
– Proficiency in SQL, BigQuery, and Python.
– Advanced SQL capabilities in BigQuery for complex validation, anomaly detection, and trend analysis.
– Experience comparing datasets across different systems.
– Proven ability to detect and investigate data discrepancies across platforms.
– Strong analytical intuition to validate metrics and identify issues that may not trigger alerts.
– Ability to compare metrics and trends side-by-side to confirm accuracy post-migration.
– Skilled in root cause analysis using SQL and domain knowledge.
– Effective communicator capable of documenting and sharing insights with both technical and non-technical audiences.
– Familiarity with time series analysis to identify unexpected metric changes.
– Ability to follow structured validation processes while suggesting workflow improvements.
Travel
– Travel as needed based on business requirements.
Compensation Transparency (for CA, CO, HI, NY, WA residents only)
– Colorado: $143,700 – $210,760
– Hawaii and New York (excluding NYC): $153,000 – $224,400
– California, New York City, and Washington: $167,400 – $245,520
Additional compensation may include bonuses, commissions, or other discretionary payments based on performance. Actual pay depends on various factors including experience, skills, certifications, and location. Learn more about benefits at https://rackspace.jobs/benefits.
About Rackspace Technology
Rackspace Technology is a leader in multicloud solutions, combining deep expertise with leading technologies across applications, data, and security. We help customers solve business challenges, design scalable systems, and manage and optimize their environments to drive long-term value. Recognized as a top workplace by Fortune, Forbes, and Glassdoor, we are committed to attracting and developing top talent. Join us in embracing technology, empowering customers, and shaping the future.
More About Rackspace Technology
At Rackspace, we are united by a shared mission: to be a valued member of a winning team with a purpose-driven vision. We bring our authentic selves to work and believe that diverse perspectives drive innovation and help us better serve our global customers and communities. We are an equal opportunity employer and welcome applicants regardless of age, race, gender identity, disability, veteran status, or any other legally protected characteristic. If you need accommodations during the hiring process, please let us know.
To apply for this job, please visit jobs.lever.co