Company: BMC Software
Location: Canada
Career Level: Mid-Senior Level
Industries: Technology, Software, IT, Electronics

Description

BMC empowers nearly 80% of the Forbes Global 100 to accelerate business value faster than humanly possible. Our industry-leading portfolio unlocks human and machine potential to drive business growth, innovation, and sustainable success. BMC does this in a simple and optimized way by connecting the people, systems, and data that power the world's largest organizations so they can seize a competitive advantage.
Title: Data Architect (Hands-On Builder)
Location: Canada (Remote)
The Data Architect will design, build, and evolve a modern, scalable data architecture that powers analytics, reporting, AI/ML use cases, and operational decision-making across the GTM business. This role defines data patterns and standards and partners with IT and business teams to ensure our data platform is secure, reliable, cost-efficient, and easy to use.
This role is ideal for someone with strong hands-on architecture depth who can define target-state architecture, guide implementation, and ensure high-quality delivery across pipelines, integrations, and governance.
Key Responsibilities:

Build and Own Core Data Pipelines (Hands-On)
  • Architect and implement end-to-end ingestion and transformation pipelines from source systems into Snowflake.
  • Develop robust ELT/ETL workflows using dbt and orchestration tools (e.g., Airflow/Dagster/ADF), including:
    • incremental loads, SCD handling, backfills, and reprocessing
    • late-arriving data patterns and idempotent job design
    • CDC-based ingestion where applicable
  • Build integrations for key enterprise SaaS systems (Salesforce, Marketing Cloud, NetSuite, Zuora, Gainsight, CSOD) and internal app databases.
Data Modeling & dbt Development
  • Own the data modeling layer in Snowflake with dbt:
    • design dimensional models (star schemas), marts, and curated layers
    • implement dbt best practices: staging → intermediate → marts, modular models, macros, packages
    • define metric-ready datasets (e.g., ARR/NRR, pipeline, churn, product usage) with consistent definitions
  • Optimize for performance and cost (clustering, warehouse sizing, query patterns, caching, micro-partitioning awareness).
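For illustration, the SCD handling mentioned above can be sketched in plain Python (the table and column names are hypothetical; in production this logic would typically live in a dbt snapshot or a Snowflake MERGE):

```python
from datetime import date

def apply_scd2(history, incoming, key, tracked, as_of):
    """Apply a Slowly Changing Dimension Type 2 update.

    history:  list of dicts with `key`, the tracked columns,
              plus 'valid_from' and 'valid_to' (None = current row).
    incoming: list of dicts with `key` and the tracked columns.
    Returns a new history: changed rows are closed out (valid_to set)
    and a fresh current version is appended. Re-running with the same
    input is a no-op, i.e., the job is idempotent.
    """
    out = [dict(r) for r in history]
    current = {r[key]: r for r in out if r["valid_to"] is None}
    for row in incoming:
        cur = current.get(row[key])
        if cur is not None and all(cur[c] == row[c] for c in tracked):
            continue  # unchanged: safe to reprocess
        if cur is not None:
            cur["valid_to"] = as_of  # close out the old version
        new = {key: row[key], **{c: row[c] for c in tracked},
               "valid_from": as_of, "valid_to": None}
        out.append(new)
        current[row[key]] = new
    return out

# Hypothetical example: a customer's tier changes, so the old
# row is closed out and a new current row is appended.
hist = [{"customer_id": 1, "tier": "silver",
         "valid_from": date(2024, 1, 1), "valid_to": None}]
new_hist = apply_scd2(hist, [{"customer_id": 1, "tier": "gold"}],
                      "customer_id", ["tier"], date(2024, 6, 1))
```

The idempotency check (skipping unchanged rows) is what makes backfills and reprocessing safe, as the bullet on idempotent job design calls for.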
Cloud Platform Enablement (AWS + Azure)
  • Implement secure, scalable data platform components across AWS and Azure:
    • landing zones, storage (S3/ADLS), networking, secrets management, and compute integration patterns
    • secure connectivity to Snowflake (PrivateLink/peering patterns where relevant)
  • Work with technology teams to implement RBAC, role hierarchy, masking policies, row access policies, and data sharing patterns.
Reliability, Testing, and Observability
  • Implement data quality and reliability controls:
    • dbt tests (schema, relationship, accepted values) and custom tests for business logic
    • anomaly detection and pipeline monitoring (e.g., Datadog/CloudWatch/Azure Monitor, Monte Carlo if applicable)
    • SLAs/SLOs for critical datasets and clear incident runbooks
  • Build operational readiness: logging, alerting, retries, failure isolation, and safe deploys.
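As a sketch of the accepted-values and not-null checks listed above (analogous in spirit to dbt's `accepted_values` and `not_null` schema tests; the dataset and column names are hypothetical):

```python
def accepted_values_check(rows, column, accepted):
    """Return rows whose `column` value falls outside the accepted set.

    Like a dbt accepted_values test, the check passes when no
    failing rows are returned.
    """
    return [r for r in rows if r.get(column) not in accepted]

def not_null_check(rows, column):
    """Return rows where `column` is missing or NULL."""
    return [r for r in rows if r.get(column) is None]

# Hypothetical opportunity records from a curated mart.
opportunities = [
    {"id": 1, "stage": "closed_won"},
    {"id": 2, "stage": "negotiation"},
    {"id": 3, "stage": None},  # fails both checks
]
bad_stage = accepted_values_check(
    opportunities, "stage",
    {"prospecting", "negotiation", "closed_won", "closed_lost"})
nulls = not_null_check(opportunities, "stage")
```

In a real pipeline these results would feed alerting and block promotion of the dataset, rather than just being returned.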
CI/CD, Version Control, and Engineering Practices
  • Build and maintain CI/CD for data development (PR checks, dbt builds, environment promotion).
  • Establish practical engineering standards:
    • naming conventions, repo structure, branching strategy
    • documentation that stays current (dbt docs, data dictionaries)
    • code reviews and design patterns that scale with the team
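One of the standards above (naming conventions) could be enforced as a lightweight CI/PR check, sketched here with dbt-style layer prefixes as an assumed convention (not BMC's actual standard):

```python
import re

# Assumed layer prefixes, following common dbt conventions.
MODEL_NAME = re.compile(r"^(stg|int|fct|dim|mart)_[a-z0-9_]+$")

def check_model_names(paths):
    """Return .sql model paths whose basename violates the convention.

    Intended as a PR check: a non-empty result fails the build.
    """
    bad = []
    for p in paths:
        name = p.rsplit("/", 1)[-1]
        if not name.endswith(".sql"):
            continue  # ignore non-model files
        if not MODEL_NAME.match(name[: -len(".sql")]):
            bad.append(p)
    return bad

violations = check_model_names([
    "models/staging/stg_salesforce__accounts.sql",
    "models/marts/fct_arr.sql",
    "models/marts/FinalRevenueModel.sql",  # violates the convention
])
```

Wired into the PR pipeline alongside the dbt build, a check like this keeps the repo consistent without relying on reviewers to catch naming drift.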
AI / Agentic AI Architecture
  • Design and implement architecture patterns that support AI/ML, GenAI, and agentic AI use cases across the Sales business, including structured and unstructured data pipelines, retrieval-ready data design, and secure enterprise data access.
  • Define scalable patterns for AI-enabled applications and agents, including metadata design, indexing, vector-ready data preparation, API/tool access, and governance controls such as lineage, auditability, observability, and guardrails.
Partner Closely with Analytics & Business Teams
  • Collaborate with BI/Analytics to ensure the curated layer supports dashboards and self-service.
  • Translate business needs into data products quickly, with tight feedback loops, iterative delivery, and measurable outcomes.

Required Qualifications
  • 10+ years of experience in data engineering and data architecture with a strong hands-on track record.
  • Strong production experience with Snowflake (performance tuning, data loading patterns, security model).
  • Strong production experience with dbt:
    • incremental models, snapshots, tests, macros, documentation, deployments
  • Experience building pipelines and integrations in AWS and/or Azure (both preferred).
  • Advanced SQL skills and strong understanding of data modeling (dimensional modeling, SCD types).
  • Experience with orchestration (Airflow, Dagster, Prefect, or Azure Data Factory) and reliable job design.
  • Strong understanding of, and experience implementing frameworks for, data quality, monitoring, and operational best practices.

Preferred Qualifications (Nice to Have)
  • Experience with ingestion tooling (Boomi, Fivetran, Matillion, Informatica, Stitch) and/or CDC tooling (Debezium/HVR).
  • Experience designing or supporting data architecture for GenAI, RAG, enterprise search, or agentic AI use cases in production environments.
  • Familiarity with modern AI-related technologies and patterns such as OpenAI APIs, LangChain/LangGraph, Snowflake Cortex, Azure AI services, vector-ready data pipelines, and secure tool integration for AI agents.
  • Experience with streaming/event ingestion (Kafka/Kinesis/Event Hubs) for near-real-time pipelines.
  • Familiarity with governance/security patterns in Snowflake (masking/row access, tags, classification).
  • Exposure to BI layers (Tableau/Power BI/Looker) and semantic/metrics frameworks.
  • Python experience for custom integrations, APIs, and automation (Lambda, Azure Functions). 
#LI-Remote 

Our commitment to you! 

 

BMC's culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won't be known just by your employee number, but for your true, authentic self. BMC lets you be YOU! 


If, after reading the above, you're unsure whether you meet the qualifications for this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talent from diverse backgrounds and experiences to ensure we face the world together with the best ideas! 

 

BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page.

BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.

The annual base salary range represents the low and high end of the BMC salary range for this position. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training, licensure, and certifications; and other business and organizational needs. 

The range listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country specific benefits.

At BMC, it is not typical for an individual to be hired at or near the top of the range. A reasonable estimate of the current range is $137,850 - $229,750.

  • Min salary: $137,850
  • Midpoint salary: $183,800
  • Max salary: $229,750


