Job Duties: Collect and quantify requirements and prepare detailed technical design documents. Convert business requirements into functional, system, performance and UAT test specifications/scenarios to develop scripts. Analyze and verify automated and manual test approaches; execute UI functional, acceptance, integration, and system testing using a programming language. Design, code and maintain automated scripts, functions/function libraries, database verification, and behavior- and data-driven tests. Implement a behavior-driven development automation framework using Cucumber, Selenium and Java. Design frameworks implementing the data-driven, keyword-driven, hybrid and Page Object Model (POM) approaches, and execute automated test cases for web-based applications, mobile applications and web services.
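The data-driven approach named above can be sketched in plain Java. This is a minimal, framework-free illustration (not Cucumber or TestNG code): the test logic is written once and executed against a table of input/expected rows, collecting all failures instead of stopping at the first. The `classifyAmount` system-under-test is a hypothetical stand-in.

```java
import java.util.List;
import java.util.Map;

// Minimal sketch of the data-driven test pattern: one test routine,
// many data rows, as a framework's data provider would supply them.
public class DataDrivenSketch {

    // Hypothetical system under test, standing in for a UI or service call.
    static String classifyAmount(int cents) {
        return cents >= 10_000 ? "HIGH" : "LOW";
    }

    // Run the one logical test against every data row; collect failures
    // rather than aborting, as data-driven runners do.
    static List<String> run(Map<Integer, String> rows) {
        return rows.entrySet().stream()
                .filter(e -> !classifyAmount(e.getKey()).equals(e.getValue()))
                .map(e -> "input=" + e.getKey() + " expected=" + e.getValue())
                .toList();
    }
}
```

In a real framework the row table would come from an external source (CSV, Excel, a Cucumber Examples table) rather than an in-memory map.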
Perform enterprise analysis, involving examination of the business problem and its proposed solution, its risks, and its feasibility; develop AS-IS and TO-BE process flows and relevant architecture and analysis artifacts. Use Agile (SAFe, Kanban) and Waterfall software development life cycles, including requirement analysis, project planning, project design, project execution, issue tracking and reporting for an onsite-offshore delivery model. Design, install, test and maintain software systems.
Install and configure Informatica MDM components. Perform data modeling, data mapping, data validation and data standardization; define business rules for cleansing, matching, merging and maintaining data. Create and configure landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups in the hub, queries, query groups and packages in MDM.
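The cleanse/match/merge rules described above can be illustrated in plain Java. This is a conceptual sketch only, not the Informatica MDM API: a cleanse rule standardizes a field, a match rule compares cleansed values, and a merge rule applies a simple survivorship policy (most recent record wins, with fallback for missing fields). The record shape and rules are hypothetical.

```java
// Conceptual sketch of MDM-style cleanse, match and merge rules
// (not Informatica API calls -- just the shape of the logic).
public class MatchMergeSketch {
    record Rec(String name, String phone, int updatedAt) {}

    // Cleansing/standardization rule: trim, collapse spaces, upper-case.
    static String cleanse(String s) {
        return s.trim().replaceAll("\\s+", " ").toUpperCase();
    }

    // Match rule: exact match on the standardized name.
    static boolean matches(Rec a, Rec b) {
        return cleanse(a.name()).equals(cleanse(b.name()));
    }

    // Merge/survivorship: the most recently updated record wins,
    // falling back to the other record for any missing field.
    static Rec merge(Rec a, Rec b) {
        Rec win = a.updatedAt() >= b.updatedAt() ? a : b;
        Rec lose = (win == a) ? b : a;
        String phone = win.phone() != null ? win.phone() : lose.phone();
        return new Rec(cleanse(win.name()), phone, win.updatedAt());
    }
}
```

Real MDM match rules are configured declaratively in the hub (fuzzy match columns, match rule sets); this sketch only shows the intent of such rules.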
The selected candidate must have a deep understanding of server and/or desktop administration, as well as SAS software usage. For SAS solutions to be implemented and work optimally, three key components need to be configured and managed effectively: 1) software-to-hardware installation; 2) metadata management; and 3) user configuration and access. Administering all three components requires skills and training in server and desktop administration, in addition to SAS software configuration, including metadata. The candidate must understand hardware and software interactions in order to optimize the system settings that will deliver the best solution performance and ensure proper integration of SAS Enterprise Business Intelligence tools.
This role requires an enquiring mind, perseverance and the ability to handle many concurrent projects. The candidate must write well and deal confidently with people at all levels of the organization. Developers will assist researchers in the use of products deployed on DAP.
JAVA experience and web service automation using JAVA; file handling using JAVA and scripting languages such as shell scripts; JAVA collections.
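The file handling and collections skills listed above can be sketched with standard JDK APIs: read a file's lines with `java.nio.file.Files` and aggregate them into a collection. The word-count task here is a hypothetical example of the kind of utility involved.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.TreeMap;

// Java file handling (NIO) plus collections: read a file's lines and
// build a sorted word-frequency map.
public class FileWordCount {
    static Map<String, Integer> countWords(Path file) throws IOException {
        Map<String, Integer> counts = new TreeMap<>(); // sorted by word
        for (String line : Files.readAllLines(file)) {
            for (String w : line.toLowerCase().split("\\W+")) {
                if (!w.isEmpty()) counts.merge(w, 1, Integer::sum);
            }
        }
        return counts;
    }
}
```

`Map.merge` does the insert-or-increment in one call, which is the idiomatic replacement for the get/null-check/put pattern.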
Duties: Gather business requirements, analyze them and provide technical specifications for testing web applications and web services using Java and Selenium. Test applications by creating detailed test plans and test cases and by automating the testing effort. Responsible for SQL Server database testing, including testing stored procedures and SQL queries using SQL Server. Automate the integration testing process to meet project requirements. Perform quality analysis of web applications and web services. Develop automated test cases using Selenium WebDriver based on functional requirements. Analyze defects and work to improve quality using Quality Center and HP AGM. Perform continuous integration of the test suite with BitBucket and Jenkins. Use JAVA, JCL, XML, COBOL, PL/SQL, MS SQL Server, Oracle, DB2, MS OLAP Services Manager, Windows and UNIX, UFT, Selenium WebDriver, Jenkins, TestNG, Cucumber, BitBucket, Eclipse, Rocket Shuttle, SharePoint, HP ALM and HP AGM.
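The Page Object Model structure that typically underlies such Selenium WebDriver suites can be sketched without a browser. This is an illustration of the pattern, not Selenium code: the driver is replaced by a hypothetical in-memory field store so the key idea stays visible -- locators and page actions live in the page class, and tests only call the actions.

```java
import java.util.HashMap;
import java.util.Map;

// Page Object Model sketch: locators are private to the page object,
// tests interact only through named actions. A real page object would
// hold a Selenium WebDriver instead of this in-memory map.
public class LoginPageSketch {
    // Stand-in for a driver: maps locators to typed-in values.
    private final Map<String, String> fields = new HashMap<>();

    // Locators belong to the page object, never to the tests.
    private static final String USER = "#username";
    private static final String PASS = "#password";

    public LoginPageSketch typeUsername(String u) { fields.put(USER, u); return this; }
    public LoginPageSketch typePassword(String p) { fields.put(PASS, p); return this; }

    // "Submit" exposes what the page would send, so a test can assert on it.
    public Map<String, String> submit() { return Map.copyOf(fields); }
}
```

Returning `this` from each action gives the fluent chaining style common in page objects; when the UI changes, only the locator constants change, not the tests.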
Duties: Work on Linux (RHEL, SUSE), Unix (Solaris and AIX) and VMware. Perform VM creation, cloning and migration of VMs on VMware vSphere. Administer VMware ESX infrastructure via vSphere Client. Reconfigure LDOMs with the new architecture and move them to a different VLAN. Install and configure Puppet. Work with Linux internals and utilities (kernel, memory, swap, CPU). Administer Veritas Volume Manager and Veritas clusters in Solaris and Linux environments. Analyze network-related issues using packets captured with tcpdump and Wireshark. Write shell scripts to automate tasks and reduce work complexity. Engineer SA-related solutions. Perform root cause analysis. Remediate vulnerabilities. Handle critical incidents related to Solaris, VxVM and VCS. Apply bug fixes related to Solaris. Provide end-to-end support for Linux/CentOS VMs provisioned through Azure. Perform Apache Tomcat configuration through OneOps. Provide performance tuning of Linux VMs. Install, configure and troubleshoot IBM InfoSphere Streams for managing real-time data processing. Install, configure, tune and troubleshoot Apache Kafka. Install and administer ZooKeeper. Administer Redis and clustered Redis.
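The memory/swap analysis work mentioned above usually means reading `/proc/meminfo`. Kept in the document's Java rather than shell, here is a small sketch that parses `/proc/meminfo`-style text (a sample string rather than the live file, so it runs anywhere) and computes swap usage; the field names are the real kernel ones.

```java
import java.util.HashMap;
import java.util.Map;

// Linux memory-analysis sketch: parse /proc/meminfo-format text and
// derive swap usage. On a live host the input would be the contents of
// /proc/meminfo; here a sample string keeps the sketch portable.
public class MemInfoSketch {
    // Returns values in kB keyed by field name, e.g. "SwapTotal".
    static Map<String, Long> parse(String meminfo) {
        Map<String, Long> out = new HashMap<>();
        for (String line : meminfo.split("\n")) {
            String[] parts = line.split(":\\s+");
            if (parts.length == 2) {
                out.put(parts[0], Long.parseLong(parts[1].replace(" kB", "").trim()));
            }
        }
        return out;
    }

    // Used swap = total swap minus free swap, as `free` reports it.
    static long swapUsedKb(Map<String, Long> m) {
        return m.getOrDefault("SwapTotal", 0L) - m.getOrDefault("SwapFree", 0L);
    }
}
```

An equivalent one-liner in the shell scripts the duties describe would combine `grep` over `/proc/meminfo` with `awk` arithmetic.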
Duties: Develop and validate solution architecture to support business requirements. Develop Spark applications that read transaction data and apply business rules to report errors and a transaction summary. Load and transform large sets of structured and semi-structured data. Read data from Kafka and process it using Spark. Adopt innovative architectural approaches to leverage in-house data integration capabilities. Analyze existing processes and prepare functional and requirements documents. Configure Spark Streaming to process weekly data received via SFTP/Portal into MapR-FS and store the streamed data in a Kafka topic. Develop multiple Spark Streaming and core jobs with Kafka as the data pipeline system. Load the DStream data into Spark RDDs and perform in-memory computation to generate the output response. Work with NoSQL databases such as HBase/MapR-DB, creating tables to load large sets of JSON data using the Spark-HBase connector. Load the HBase data into a Redshift cluster using Spark Structured Streaming. Create Hive external tables on HBase using INSERT OVERWRITE with S3 as the data store. Troubleshoot developed Spark jobs. Manage and review YARN application logs, Spark event logs and metrics-sink CSV files. Improve the performance and optimization of existing Spark jobs. Develop solutions using commercial and open-source software, including MiNiFi and NiFi, to interface big data and relational solutions. Load data from a SQL Server DB on Azure to a Redshift data warehouse using Spark Structured Streaming. Work on S3 lifecycle rules management and Redshift inline policy management to load, unload or copy data from S3. Work on Hive-with-S3 data store optimizations. Design and implement solutions for metadata, data quality and privacy management. Collaborate with subject-matter experts to design and enable ad hoc data analysis and a robust data consumption platform.
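The "apply business rules to report errors and a transaction summary" step of the pipeline can be sketched framework-free. This is not Spark code: it is the per-batch logic whose body would run inside a Spark transformation over each micro-batch read from Kafka. The transaction shape and the error rule are hypothetical.

```java
import java.util.List;

// Per-batch business-rule step of the streaming pipeline, sketched
// without Spark: in the real job this logic would run inside a Spark
// transformation over each micro-batch consumed from Kafka.
public class TxnRulesSketch {
    record Txn(String id, long amountCents) {}
    record Summary(long valid, long errors, long totalCents) {}

    // Hypothetical rule: error if the id is missing or the amount is non-positive.
    static boolean isError(Txn t) {
        return t.id() == null || t.id().isBlank() || t.amountCents() <= 0;
    }

    // Produce the error count and the summary over one batch.
    static Summary summarize(List<Txn> batch) {
        long errors = batch.stream().filter(TxnRulesSketch::isError).count();
        long total = batch.stream().filter(t -> !isError(t))
                          .mapToLong(Txn::amountCents).sum();
        return new Summary(batch.size() - errors, errors, total);
    }
}
```

Keeping the rule logic in a pure function like this also makes the Spark job testable without a cluster, which matches the troubleshooting and optimization duties above.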
Support the analytics team on data presentation and reporting by importing query results or direct data into Power BI using the Redshift connector or an ODBC driver over SSL.
Duties: Implement Object-Oriented technologies, web-based Client-Server architecture and Service-Oriented Architecture (SOA) using J2EE technologies including web services (SOAP/RESTful), Spring Boot, Spring Batch, Hibernate, AngularJS and Struts. Develop mobile apps for Android and iOS using Kony technology. Perform the Software Development Life Cycle (SDLC), including analysis, estimation, design, development, testing and deployment. Build and deploy applications using Gradle, Maven and Ant build scripts. Perform JAVA/J2EE/Drools-based software development. Code, test, debug, implement, document and develop programs using Drools. Research and analyze information to determine, recommend and plan implementation of modifications to existing systems. Perform software system testing and validation procedures. Perform troubleshooting and diagnostic tasks. Provide plans for technical implementation activities. Prepare and document application code and procedures for technical issues related to application functionality. Evaluate interfaces between hardware, software and application components. Develop specifications and performance requirements. Use WebSphere, WebLogic, Apache Tomcat, RAD, Eclipse, HP Quality Center (QC), SQL, PL/SQL, Agile and Waterfall.
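The Drools-based rule development above follows a when/then shape that can be sketched in plain Java. This is not the Drools API: each rule pairs a condition with an outcome, and a tiny engine fires every rule whose condition matches the fact. The order/discount domain and percentages are hypothetical.

```java
import java.util.List;
import java.util.function.Predicate;

// When/then rule-evaluation sketch in plain Java (not the Drools API):
// the engine fires every rule whose condition matches the given fact,
// mirroring how a Drools session matches rules against inserted facts.
public class RuleEngineSketch {
    record Order(double total, boolean firstPurchase) {}
    record Rule(String name, Predicate<Order> when, int discountPct) {}

    // Sum the discounts of all rules whose "when" clause matches.
    static int applyRules(Order o, List<Rule> rules) {
        return rules.stream()
                .filter(r -> r.when().test(o))
                .mapToInt(Rule::discountPct)
                .sum();
    }
}
```

In actual Drools the conditions live in DRL files and the engine handles matching via Rete, but the when/then contract is the same as this sketch's `Predicate` plus outcome.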