Google Cloud Platform Data Architect (Remote)
Posted 2025-04-06
Remote, USA
Full-time
Immediate Start
Role: Senior Data Architect (Google Cloud Platform)
Location: Remote...
Duration: 12+ Months

Key Responsibilities:
- Lead the requirement-gathering process and create comprehensive high-level and detailed technical designs.
- Develop a robust data ingestion framework for diverse data sources.
- Participate actively in architectural discussions, perform system analysis (reviewing existing systems and operational methodologies), and evaluate emerging technologies to propose solutions that address current needs and simplify future changes.
- Design data models suitable for transactional and big data environments, serving as input for machine learning processing.
- Design and build the infrastructure needed for efficient ETL from various data sources, leveraging Google Cloud Platform services (a pipeline of this kind is sketched below).
- Develop data and semantic interoperability specifications.
- Collaborate with the business to define and scope project requirements.
- Partner with external vendors to facilitate data acquisition.
- Analyze existing systems to identify suitable data sources.
- Implement and continuously improve data automation processes.
- Champion continuous improvement in DevOps automation.
- Provide design expertise in Master Data Management, Data Quality, and Metadata Management.

Required Skills:
- Active Google Cloud Data Engineer or Google Professional Cloud Architect certification.
- Minimum 8 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using Google Cloud Platform data and analytics services alongside third-party tools (Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub).
- Minimum 5 years of experience performing detailed assessments of current data platforms and crafting strategic plans for migration to Google Cloud Platform.
- Strong Python development experience (mandatory).
- 2+ years of data engineering experience with distributed architectures, ETL, EDW, and big data technologies.
- Demonstrated knowledge of and experience with BigQuery (mandatory).
- Experience with Dataproc and Dataflow using Java on Google Cloud Platform.
- Experience with serverless data warehousing concepts on Google Cloud.
- Experience with DW/BI modeling frameworks.
- Strong understanding of Oracle databases; familiarity with GoldenGate is highly desired.
- Expertise in Debezium and Apache Flink for change data capture and processing.
- Experience working with both structured and unstructured data sources using cloud analytics platforms (e.g., Cloudera, Hadoop).
- Experience with data mapping and modeling.
- Experience with data analytics tools.
- Proven proficiency in one or more programming/scripting languages: Python, JavaScript, Java, R, UNIX shell, PHP, or Ruby.
- Experience with Google Cloud services: streaming and batch processing, Cloud Storage, Cloud Dataflow, Dataproc, Cloud Functions, BigQuery, and Bigtable.
- Knowledge and proven use of contemporary data mining, cloud computing, and data management tools: Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark.
- Bachelor's degree, or equivalent work experience (minimum 12 years).

Contact: Arvind Kumar Bind, SPAR Information Systems
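As a rough illustration of the kind of pipeline this role involves, below is a minimal Apache Beam (Python) sketch that streams JSON events from Cloud Pub/Sub into BigQuery on Cloud Dataflow; it is a hedged example only, and every project, topic, bucket, and table name in it is a hypothetical placeholder, not part of this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Pipeline options for a streaming Dataflow job.
    # All identifiers below are hypothetical placeholders.
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",                   # use "DirectRunner" for local testing
        project="example-project",                 # placeholder GCP project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",   # placeholder staging bucket
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw event bytes from a Pub/Sub topic (placeholder name).
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode each message into a dict matching the target table schema.
            | "ParseJson" >> beam.Map(json.loads)
            # Append rows to an existing BigQuery table (placeholder name);
            # CREATE_NEVER assumes the table and schema already exist.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

In practice, candidates would be expected to extend a skeleton like this with schema management, dead-letter handling, and windowed aggregations as the ingestion framework matures.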