Job Description
Desired Qualifications:
- Experience working with cloud technologies (MS Azure, AWS, GCP)
- Experience working with a Hadoop platform
- Programming skills and use of software packages/tools such as Python, Spark, Java, SQL
- Process documentation experience
Familiarity with the following technologies/platforms:
- Hadoop/Big Data: HDFS, Spark, Kafka, NiFi, MapReduce, Pig, Hive, Impala, HBase, Elasticsearch, Cassandra, Sqoop, Oozie, ZooKeeper, Flume, Storm, YARN, MongoDB, Ranger, Mahout, Falcon, Avro, AWS.
- Java & J2EE Technologies: Core Java, Hibernate, Spring, JSP, Servlets, Java Beans, JDBC, EJB 3.0, JMS, JMX, RMI.
- IDE Tools: Eclipse, IntelliJ.
- Programming languages: Java, Python, Scala, C, C++, MATLAB, SAS, PHP, SQL, PL/SQL.
- Web Services & Technologies: XML, HTML, XHTML, JNDI, HTML5, AJAX, jQuery, JSON, CSS, JavaScript, AngularJS, VBScript, WSDL, SOAP and RESTful.
- ETL tools: Pentaho, Talend, Informatica (MDM, IDQ, TPT), Teradata.
- Databases: Oracle, SQL Server, BigSQL, MySQL, DB2, NoSQL.
- Application Servers: Apache Tomcat, WebLogic, WebSphere, JBoss.
- Tools: Maven, SBT, ANT, JUnit, Log4j.
- Operating Systems: Windows, UNIX, Linux, Mac OS.