
Similar Jobs

  • Tekfortune Inc

    Data Engineer

    Columbus, OH, United States

    • Ending Soon

    Data Engineer. Must-have skills: 5+ years of experience in Big Data (Hive, Spark), AWS preferred; programming in Python and Spark; Linux and shell scripting; database and SQL proficiency; DevOps with continuous deployment via Jenkins; data lakes and building data pipelines in Hadoop and AWS services such as Glue, S3, and Athena. Nice to have: C, C++, Java.

    Job Source: Tekfortune Inc
  • Ohio State University Wexner Medical Center

    Enterprise Architect

    Columbus, OH, United States

    • Ending Soon

    Job Description: Information Technology is responsible for the use of any computers, software applications, storage, networking, and other hardware or physical devices, infrastructures, and processes for creating, managing, securing, and exchanging all forms of electronic data. It incorporates leading-edge techniques for collaboratively enhancing the p…

    Job Source: Ohio State University Wexner Medical Center
  • PERRY proTECH

    Solution Architect

    Columbus, OH, United States

    Job Description: Have you ever wanted to be an Employee Owner? We are looking for an analytical individual with a customer-centric mindset to join our IT team of experts in designing and implementing technology solutions for customers! This person will be responsible for assessing the needs and expectations of customers, translating…

    Job Source: PERRY proTECH
  • Renaissance Services

    Solutions Architect

    Columbus, OH, United States

    When you join Renaissance, you join a global leader in pre-K–12 education technology. Renaissance’s solutions help educators analyze, customize, and plan personalized learning paths for students, allowing time for what matters—creating energizing learning experiences in the classroom. Our fiercely passionate employees and educational partners have

    Job Source: Renaissance Services
  • Condado Tacos

    Solutions Architect

    Columbus, OH, United States

    Support Center, 777 Goodale Blvd., Columbus, Ohio, United States of America. Req #11857. Wednesday, June 12, 2024. Who We Are: Condado Tacos is an energetic, colorful place where you can live your best taco-marg-lovin' life and have an experience as unique as you. We aspire to make our restaurants a place to be who you are and to celebrate the indiv…

    Job Source: Condado Tacos
  • Saxon Global

    Solutions Architect

    Columbus, OH, United States

    We are searching for a can-do, forward-thinking, and service-hearted Solution Architect for the Enterprise Payments - Payment Products team. The Payment Products team supports origination, servicing, and core processing for Huntington credit card and debit card products. As a leader within our Application Development team, you will be a techn…

    Job Source: Saxon Global
  • JPMorgan Chase

    Lead Architect

    Columbus, OH, United States

    A career with us is a journey, not a destination. This could be the next best step in your technical career. Join us. As a Lead Architect at JPMorgan Chase within the Consumer and Community Banking Administration, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on…

    Job Source: JPMorgan Chase
  • Tech M USA / Avance Consulting

    Java Architect

    Columbus, OH, United States

    • Ending Soon

    Title: Java Architect. Location: Columbus, OH (Remote). JD:
    • Strong experience with front-end web development and frameworks (React/Vue, etc.)
    • Experience building scalable backend services with Java
    • Experience with headless and composable architecture
    • Experience with relational databases such as MS SQL Server and/or MySQL/PostgreSQL
    • Strong under…

    Job Source: Tech M USA / Avance Consulting

Bigdata Architect

Columbus, OH, United States

Role: Bigdata Architect

Location: 50 West Town Street, Columbus, Ohio 43215 (on-site Tuesdays and Thursdays, remote the remaining days)

Complete Description:

The Technical Specialist will be responsible for Medicaid Enterprise Data Warehouse (EDW) design, development, implementation, migration, maintenance, and operation activities. The candidate will work closely with the Data Governance and Analytics team and will be one of the key technical resources for various Enterprise Data Warehouse projects, building critical Data Marts and ingesting data into the Big Data platform for analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and works closely with the Business Intelligence & Data Analytics team.

Responsibilities:
• Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
• Perform data analysis, data profiling, data quality checks, and data ingestion in various layers using Big Data/Hadoop/Hive/Impala queries, PySpark programs, and UNIX shell scripts.
• Follow the organization's coding standard document; create mappings, sessions, and workflows per the mapping specification document.
• Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
• Create jobs in Hadoop using Sqoop, PySpark, and StreamSets to meet business user needs.
• Create mock-up data, perform unit testing, and capture result sets for jobs developed in the lower environment.
• Update the production support run book and Control-M schedule document per each production release.
• Create and update design documents, providing a detailed description of workflows after every production release.
• Continuously monitor production data loads, fix issues, update the tracker document with those issues, and identify performance problems.
• Tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
• Perform quality assurance checks and post-load reconciliation, and communicate with the vendor to receive corrected data.
• Participate in ETL/ELT code reviews and design reusable frameworks.
• Create Remedy/ServiceNow tickets to fix production issues, and create support requests to deploy Database, Hadoop, Hive, Impala, UNIX, ETL/ELT, and SAS code to the UAT environment.
• Create Remedy/ServiceNow tickets and/or incidents to trigger Control-M jobs for FTP and ETL/ELT jobs on an ad hoc, daily, weekly, monthly, and quarterly basis as needed.
• Model and create STAGE/ODS/Data Warehouse Hive and Impala tables as needed.
• Create change requests, work plans, test results, and BCAB checklist documents for code deployment to the production environment, and perform code validation post-deployment.
• Work with the Hadoop, ETL, and SAS admin teams for code deployments and health checks.
• Create reusable UNIX shell scripts for file archival, file validation, and Hadoop workflow looping.
• Create a reusable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables, serving as a single point of reference for workflows.
• Create PySpark programs to ingest historical and incremental data (see the sketch after this list).
• Create Sqoop scripts to ingest historical data from the EDW module vendor's databases into Hadoop IOP, and create Hive table and Impala view creation scripts for dimension tables.
• Participate in meetings to continuously upgrade functional and technical expertise.
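To make the PySpark ingestion responsibility more concrete, here is a minimal, illustrative sketch of appending one day's incremental extract into a partitioned Hive table with a basic profiling check. All names in it (the /landing/claims_raw path, the edw.claims_stage table, the claim_id and load_dt columns) are hypothetical and not taken from this posting; a real Medicaid EDW pipeline would follow the project's own standards and its StreamSets/Sqoop tooling.

# Minimal PySpark sketch (hypothetical paths and table names) for an incremental load.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("edw_incremental_ingest")
    .enableHiveSupport()  # assumes a Hive metastore is available to Spark
    .getOrCreate()
)

# Read the day's extract; Parquet here, but Avro/ORC work the same way via .format().
incoming = spark.read.parquet("/landing/claims_raw/2024-06-12/")

# Light data-quality check before loading: row count and null-key count.
total_rows = incoming.count()
null_keys = incoming.filter(F.col("claim_id").isNull()).count()
print(f"rows={total_rows}, null claim_id={null_keys}")

# Stamp a load date and append into the partitioned stage table.
(
    incoming
    .withColumn("load_dt", F.lit("2024-06-12"))
    .write
    .mode("append")
    .partitionBy("load_dt")
    .saveAsTable("edw.claims_stage")
)

spark.stop()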
REQUIRED Skill Sets:
• 8+ years of experience with Big Data/Hadoop on data warehousing or data integration projects.
• Analysis, design, development, support, and enhancement of ETL/ELT in a data warehouse environment with Cloudera Big Data technologies (minimum of 8-9 years' experience with Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, Hue, Kafka, YARN, Python, Flume, ZooKeeper, Sentry, and Cloudera Navigator), along with Oracle SQL/PL-SQL, UNIX commands, and shell scripting.
• Strong development experience (minimum of 8-9 years) creating Sqoop and PySpark scripts and programs, HDFS commands, HDFS file formats (Parquet, Avro, ORC, etc.), StreamSets pipelines, job schedules, Hive/Impala queries, and UNIX shell scripts.
• Writing Hadoop/Hive/Impala scripts (minimum of 8-9 years' experience) for gathering statistics on tables after data loads.
• Strong SQL experience (Oracle and Hadoop Hive/Impala).
• Writing complex SQL queries and tuning them based on Hadoop/Hive/Impala explain plan results (a short illustration follows the skill lists below).
• Proven ability to write high-quality code.
• Experience building data sets and familiarity with PHI and PII data.
• Expertise implementing complex ETL/ELT logic.
• Develop and enforce a strong reconciliation process.
• Accountable for ETL/ELT design documentation.
• Good knowledge of Big Data, Hadoop, Hive, and Impala databases, data security, and dimensional model design.
• Basic knowledge of UNIX/Linux shell scripting.
• Utilize ETL/ELT standards and practices toward establishing and following a centralized metadata repository.
• Good experience working with Visio, Excel, PowerPoint, Word, etc.
• Effective communication, presentation, and organizational skills.
• Familiarity with project management methodologies such as Waterfall and Agile.
• Ability to establish priorities and follow through on projects, paying close attention to detail with minimal supervision.
• Required education: BS/BA degree or an equivalent combination of education and experience.

DESIRED Skill Sets:
• Demonstrated effective leadership, analytical, and problem-solving skills.
• Excellent written and oral communication skills with technical and business teams.
• Ability to work independently as well as part of a team.
• Stay abreast of current technologies in the assigned area of IT.
• Establish facts and draw valid conclusions.
• Recognize patterns and opportunities for improvement throughout the entire organization.
• Ability to discern critical from minor problems and innovate new solutions.
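As a hedged illustration of the stats-gathering and explain-plan items in the required skills above, the sketch below refreshes table statistics after a load and prints a query plan using Spark SQL. The table and column names are the same hypothetical ones as in the earlier sketch; on Impala the equivalent command would be COMPUTE STATS, and the same ANALYZE TABLE statements can be run directly in Hive.

# Post-load housekeeping sketch (hypothetical table/columns): refresh stats, inspect a plan.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("edw_post_load_checks")
    .enableHiveSupport()
    .getOrCreate()
)

# Gather table- and column-level statistics so the optimizer works from fresh numbers.
spark.sql("ANALYZE TABLE edw.claims_stage COMPUTE STATISTICS")
spark.sql("ANALYZE TABLE edw.claims_stage COMPUTE STATISTICS FOR COLUMNS claim_id, load_dt")

# Review the plan of a typical reporting query before tuning (partition pruning, join strategy, etc.).
report = spark.sql("""
    SELECT load_dt, COUNT(*) AS claim_cnt
    FROM edw.claims_stage
    WHERE load_dt = '2024-06-12'
    GROUP BY load_dt
""")
report.explain(True)  # prints the parsed, analyzed, optimized, and physical plans

spark.stop()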
