
BigData Engineer / Architect

Nitor Infotech
Full-time
On-site
Portland, Oregon, United States

Company Description

We are seeking a strong Big Data professional: a team player with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within internal teams and with partner organizations and suppliers.

Role: Big Data Engineer

Location: Portland, OR

Duration: Full Time

Skill Matrix:

MapReduce - Required
Apache Spark - Required
Informatica PowerCenter - Required
Hive - Required
Apache Hadoop - Required
Core Java / Python - Highly Desired
Healthcare Domain Experience - Highly Desired

     

      Job Description

      Responsibilities and Duties

      • Participate in technical planning and requirements-gathering phases, including architectural design, coding, testing, troubleshooting, and documentation of big data-oriented software applications.
      • Responsible for the ingestion, maintenance, improvement, cleaning, and manipulation of data in the business’s operational and analytics databases, and for troubleshooting any existing issues.
      • Implement, troubleshoot, and optimize distributed solutions based on modern big data technologies such as Hive, Hadoop, Spark, Elasticsearch, Storm, and Kafka, in both on-premise and cloud deployment models, to solve large-scale processing problems.
      • Design, enhance, and implement ETL/data ingestion platforms on the cloud.
      • Strong data warehousing skills, including data clean-up, ETL, ELT, and handling scalability issues for an enterprise-level data warehouse.
      • Capable of quickly investigating, becoming familiar with, and mastering new data sets.
      • Strong troubleshooting and problem-solving skills in large data environments.
      • Experience building data platforms on the cloud (AWS or Azure).
      • Experience using Python, Java, or any other language to solve data problems.
      • Experience implementing SDLC best practices and Agile methods.

      Qualifications

      Required Skills:

      • Data architecture / Big Data / ETL environment
      • Experience with ETL design using tools such as Informatica, Talend, Oracle Data Integrator (ODI), Dell Boomi, or equivalent
      • Big Data & Analytics solutions: Hadoop, Pig, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.) / Azure (HDInsight, Data Lake design)
      • Building and managing hosted big data architecture; toolkit familiarity with Hadoop, Oozie, Sqoop, Pig, Hive, HBase, Avro, Parquet, Spark, and NiFi
      • Foundational data management concepts - RDM and MDM
      • Experience working with JIRA, Git, Bitbucket, JUnit, and other code management toolsets
      • Strong hands-on knowledge of solution languages such as Java, Scala, or Python - any one is fine
      • Healthcare domain knowledge

      Required Experience, Skills and Qualifications

      Qualifications:

      • Bachelor’s degree with a minimum of 6 to 9+ years of relevant experience, or equivalent.
      • Extensive experience in a data architecture / Big Data / ETL environment.

      Additional Information

      All your information will be kept confidential according to EEO guidelines.