Data Engineer (TX Residents only)
Posted 2025-09-12
Remote, USA
Full Time
Immediate Start
Thrive in a culture of innovation and teamwork. We're hiring a Data Engineer! This remote position offers an immediate start for the right candidate and calls for a strong, diverse skill set across data engineering, reporting, and analytics. A competitive salary is on offer for the successful candidate.
• State of Texas Residents Only
Title: Data Scientist (Big Data Engineer) 2...
Location: Remote (Open for Texas residents only)
Responsibilities:
• Data Conversion and Reporting Support for System Modernization Efforts
• Data Transformation and Integration: Prepare and optimize data for migration to the Snowflake and SAS Viya platforms, ensuring seamless integration and functionality by creating data transformation processes using ETL, SQL, Python, and R (a brief illustrative sketch follows this list).
• Develop Federal and State Reports: Build comprehensive reports that meet federal and state requirements using Snowflake and SAS Viya, ensuring accuracy and compliance.
• Scrum Team Collaboration: Work as a member of an agile Scrum team to deliver new features, functions, and best-in-class, value-focused technology solutions.
• Data Quality Management: Develop and implement databases, ETL processes, data collection systems, and data quality strategies that optimize statistical efficiency, accuracy, and quality.
• Problem Examination and Resolution: Investigate problems in the data intelligence space using ETL tooling such as AWS Lambda and AWS Glue, and implement the changes needed to improve data quality.
• Data Analytics and Insights: Utilize advanced data analytics techniques to support strategic decision-making, ensuring data integrity, quality, and timeliness of results.
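For illustration only, a minimal sketch of the kind of Python-based transformation and Snowflake load described above. The source file, column names, staging table, and connection placeholders are hypothetical, not details taken from this posting.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names and derive a reporting-month field."""
    df = df.rename(columns=str.lower)
    df["report_month"] = pd.to_datetime(df["service_date"]).dt.to_period("M").astype(str)
    return df.dropna(subset=["case_id"])  # drop rows missing the hypothetical key


def load_to_snowflake(df: pd.DataFrame) -> None:
    """Bulk-load the transformed frame into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        account="<account>",      # placeholder connection details
        user="<user>",
        password="<password>",
        warehouse="<warehouse>",
        database="<database>",
        schema="<schema>",
    )
    try:
        # Assumes the STG_CASES staging table already exists in the schema.
        write_pandas(conn, df, table_name="STG_CASES")
    finally:
        conn.close()


if __name__ == "__main__":
    raw = pd.read_csv("cases_extract.csv")  # hypothetical extract file
    load_to_snowflake(transform(raw))
```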
Qualifications:
• Proven experience in data conversion and report building using Snowflake and SAS Viya.
• Demonstrated experience in data transformation processes using ETL, SQL, Python, and R.
• Experience working with data analytics and business intelligence tools.
• Experience working in a Scrum or Agile development environment.
• Proficiency in ETL processes and tools such as AWS Glue and Lambda.
• Strong knowledge of database management and data warehousing concepts.
• Expertise in SQL for data querying and manipulation (see the brief sketch below).
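Likewise, a small sketch of SQL-based querying from Python against the hypothetical STG_CASES table used above; the query, columns, and parameter are illustrative only.

```python
import snowflake.connector

# Aggregation over the hypothetical staging table, bound with the
# connector's default pyformat parameter style.
REPORT_SQL = """
    SELECT report_month,
           COUNT(*)                AS case_count,
           COUNT(DISTINCT case_id) AS distinct_cases
    FROM   STG_CASES
    WHERE  report_month = %(month)s
    GROUP  BY report_month
"""


def monthly_report(conn, month: str):
    """Run the aggregation and return the rows for the requested month."""
    cur = conn.cursor()
    try:
        cur.execute(REPORT_SQL, {"month": month})
        return cur.fetchall()
    finally:
        cur.close()
```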
Don't hesitate to apply! Don't worry if you don't meet every single requirement; we value a great attitude and a willingness to learn above all. Submit your application today!