Duties: Gather and analyze business requirements and provide technical specifications for testing web applications and web services using Java and Selenium. Test applications by creating detailed test plans and test cases and by automating the testing effort. Responsible for SQL Server database testing, including testing stored procedures and SQL queries. Automate the integration testing process to meet project requirements. Perform quality analysis of web applications and web services. Develop automated test cases using Selenium WebDriver based on functional requirements. Analyze defects and work to improve quality using HP Quality Center and HP AGM. Perform continuous integration of the test suite with Bitbucket and Jenkins. Use Java, JCL, XML, COBOL, PL/SQL, MS SQL Server, Oracle, DB2, MS OLAP Services Manager, Windows and UNIX, UFT, Selenium WebDriver, Jenkins, TestNG, Cucumber, Bitbucket, Eclipse, Rocket Shuttle, SharePoint, HP ALM and HP AGM.
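To illustrate the database test automation described above (verifying stored-procedure output against expected results), here is a minimal, self-contained Java sketch. The class, method, and column names are hypothetical and not taken from the actual project:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative helper for database test automation: compares the rows a
// stored procedure actually returned against the rows a test case expects,
// and reports each mismatch so it can be logged as a defect.
public class RowDiff {

    // Returns one human-readable message per mismatched row or count problem;
    // an empty list means the actual result matched the expectation.
    public static List<String> diff(List<Map<String, String>> expected,
                                    List<Map<String, String>> actual) {
        List<String> problems = new ArrayList<>();
        if (expected.size() != actual.size()) {
            problems.add("row count: expected " + expected.size()
                         + ", got " + actual.size());
        }
        int n = Math.min(expected.size(), actual.size());
        for (int i = 0; i < n; i++) {
            if (!expected.get(i).equals(actual.get(i))) {
                problems.add("row " + i + ": expected " + expected.get(i)
                             + ", got " + actual.get(i));
            }
        }
        return problems;
    }
}
```

In a TestNG suite, a check like this would typically sit behind an assertion so Jenkins can fail the build when the result sets diverge.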
Duties: Work on Linux (RHEL, SUSE), Unix (Solaris and AIX) and VMware. Perform VM creation, cloning and migration of VMs on VMware vSphere. Administer the VMware ESX infrastructure through vSphere Client. Reconfigure LDOMs with the new architecture and move them to a different VLAN. Install and configure Puppet. Work with Linux internals and utilities (kernel, memory, swap, CPU). Administer Veritas Volume Manager and Veritas clusters in Solaris and Linux environments. Analyze network-related issues with packets captured through tcpdump and Wireshark. Write shell scripts to automate tasks and reduce work complexity. Engineer SA-related solutions. Perform root cause analysis. Remediate vulnerabilities. Handle critical incidents related to Solaris, VxVM and VCS. Perform bug fixes related to Solaris. Provide end-to-end support for Linux/CentOS VMs provisioned through Azure. Configure Apache Tomcat through OneOps. Perform performance tuning of Linux VMs. Install, configure and troubleshoot IBM InfoSphere Streams for managing real-time data processing. Install, configure, tune and troubleshoot Apache Kafka. Install and administer ZooKeeper. Administer Redis, both standalone and clustered.
Duties: Develop and validate solution architecture to support business requirements. Develop Spark applications that read transaction data and apply business rules to report errors and a transaction summary. Load and transform large sets of structured and semi-structured data. Read data from Kafka and process it using Spark. Adopt innovative architectural approaches to leverage in-house data integration capabilities. Analyze existing processes and prepare functional and requirements documents. Configure Spark Streaming to process the weekly data received via SFTP/portal into MapR-FS and store the streamed data to a Kafka topic. Develop multiple Spark Streaming and core jobs with Kafka as the data pipeline. Load the DStream data into Spark RDDs and complete in-memory computation to generate the output response. Work on NoSQL databases such as HBase/MapR-DB, creating tables to load large sets of JSON data using the Spark-HBase connector. Load the HBase data into a Redshift cluster using Spark Structured Streaming. Create Hive external tables on HBase and populate them using INSERT OVERWRITE with S3 as the data store. Troubleshoot developed Spark jobs. Manage and review YARN application logs, Spark event logs and metrics-sink CSV files. Improve the performance and optimization of existing Spark jobs. Develop solutions using commercial and open-source software, including MiNiFi and NiFi, to interface big data and relational solutions. Load data from a SQL Server database on Azure to the Redshift data warehouse using Spark Structured Streaming. Manage S3 lifecycle rules and Redshift inline policies to load, unload or copy data from S3. Work on Hive optimizations with S3 as the data store. Design and implement solutions for metadata, data quality and privacy management. Collaborate with subject-matter experts to design and enable ad hoc data analysis and a robust data consumption platform.
Support the analytics team on data presentation and reporting that imports or directly queries data in Power BI using the Redshift connector or an ODBC driver over SSL.
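The per-record business-rule step described above (reading transactions and reporting errors versus a clean summary) can be sketched as a pure function of the kind a Spark map or filter stage would apply. The record fields and rules below are hypothetical, purely to show the shape of the check:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a per-record business-rule check that a Spark job could
// apply while processing transaction data read from Kafka. In the pipeline,
// records with violations would feed the error report and clean records the
// transaction summary. Field names and rules are illustrative only.
public class TransactionRules {

    public static class Transaction {
        public final String id;
        public final double amount;

        public Transaction(String id, double amount) {
            this.id = id;
            this.amount = amount;
        }
    }

    // Returns an empty list for a clean record, or one message per violation.
    public static List<String> validate(Transaction t) {
        List<String> errors = new ArrayList<>();
        if (t.id == null || t.id.isEmpty()) {
            errors.add("missing transaction id");
        }
        if (t.amount <= 0) {
            errors.add("non-positive amount: " + t.amount);
        }
        return errors;
    }
}
```

Keeping the rule logic in a plain function like this makes it unit-testable outside the cluster, which simplifies troubleshooting the deployed Spark jobs.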
Duties: Implement Object-Oriented technologies, Web-based, Client-Server Architecture and Service-Oriented Architecture (SOA) using J2EE technologies including web services (SOAP/RESTful), Spring Boot, Spring Batch, Hibernate, AngularJS and Struts. Develop mobile apps for Android and iOS using Kony technology. Perform the Software Development Life Cycle (SDLC), including analysis, estimation, design, development, testing and deployment. Build and deploy applications using Gradle, Maven and Ant build scripts. Perform Java/J2EE/Drools-based software development. Code, test, debug, implement, document and develop programs using Drools. Research and analyze information to determine, recommend and plan implementation of modifications to the existing system. Perform software system testing and validation procedures. Perform troubleshooting and diagnostic tasks. Provide plans for technical implementation activities. Prepare and document application code and procedures for technical issues related to application functionality. Evaluate interfaces between hardware, software and application components. Develop specifications and performance requirements. Use WebSphere, WebLogic, Apache Tomcat, RAD, Eclipse, HP Quality Center (QC), SQL, PL/SQL, Agile and Waterfall.
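The Drools-based development mentioned above follows a condition/action shape ("when <condition> then <action>"). As a rough plain-Java sketch of that shape only: a real project would express rules in DRL files and fire them through a Drools KieSession, not through a hand-rolled loop like this. All names here are illustrative:

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Plain-Java analogue of the "when <condition> then <action>" structure of a
// Drools rule: each Rule pairs a condition (Predicate) with an action
// (Consumer), and fireAll runs every rule whose condition matches a fact.
public class MiniRuleEngine {

    public record Rule<T>(String name, Predicate<T> when, Consumer<T> then) {}

    // Fires the action of every matching rule; returns how many rules fired.
    public static <T> int fireAll(List<Rule<T>> rules, T fact) {
        int fired = 0;
        for (Rule<T> r : rules) {
            if (r.when().test(fact)) {
                r.then().accept(fact);
                fired++;
            }
        }
        return fired;
    }
}
```

The separation of condition from action is what makes rule sets easy to document and modify independently, which is the main reason a rules engine is used in this kind of system.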