Big Data Engineer (Cloudera)

Location: Lakewood, CA
Date Posted: 11-22-2017
We are looking for an expert with proven experience in all aspects of the architecture, design, and implementation of Data Management solutions using a Big Data platform on Cloudera or Hortonworks, as well as other areas of enterprise application platforms. Experience working in the healthcare industry is preferred.

Responsibilities
• Convert concepts into technical architecture, design, and implementation
• Provide guidance on choosing the ideal architecture, evaluating tools and frameworks, and defining standards and best practices for implementing scalable business solutions
• Implement batch and real-time data ingestion/extraction processes through ETL, streaming, APIs, etc., between diverse source and target systems with structured and unstructured datasets
• Design and build data solutions with an emphasis on performance, scalability, and high reliability
• Code, test, and document new or modified data systems to create robust and scalable applications for data analytics
• Build data models for analytics and application layers
• Contribute to leading and building a team of top-performing data technology professionals
• Help with project planning and scheduling 
• Maintain confidentiality and comply with the Health Insurance Portability and Accountability Act (HIPAA)

Required Knowledge and Skills
• Expert-level experience with Hadoop cluster components and services (e.g., HDFS, YARN, ZooKeeper, Ambari/Cloudera Manager, Sentry/Ranger, Kerberos)
• Ability to participate in, and lead, the resolution of technical issues while engaged with infrastructure and vendor support teams
• Experience building stream-processing systems using solutions such as Kafka, Storm, or Spark Streaming
• Proven experience with Big Data tools such as Spark, Hive, Impala, PolyBase, Phoenix, Presto, and Kylin
• Experience integrating data from multiple data sources (using an ETL tool such as Talend)
• Experience building solutions with NoSQL databases such as HBase and MemSQL
• Strong experience with database technologies, data warehousing, data validation and certification, data quality, metadata management, and data governance
• Experience with programming languages such as Java, Scala, or Python
• Experience implementing web applications and web services APIs (REST/SOAP)