Ventois is always looking for talented people to become its team members. We realize that productive people are always a good addition to our organization. So if you have the financial acumen and the passion to work with some of the biggest companies in the world, join in!
- Interact with business analysts and solution architects to gather requirements and refine solutions.
- Create generic schemas, context groups, and context variables to run jobs against different environments such as Dev, Test, and Prod.
- Convert JSON files to XML and CSV, maintain audit log files and statistics in Talend Open Studio, and decommission legacy systems.
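The JSON-to-CSV conversion step can be sketched in plain Python (outside Talend); the file layout and the assumption of a flat JSON array are illustrative:

```python
import csv
import json

def json_to_csv(json_path, csv_path):
    """Flatten a JSON array of records into a CSV file.

    Assumes the JSON file holds a list of flat objects; nested
    structures would need an extra flattening pass.
    """
    with open(json_path) as f:
        records = json.load(f)
    # Union of all keys, sorted for a stable column order.
    fieldnames = sorted({key for rec in records for key in rec})
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
```

In Talend the same hop is typically a tFileInputJSON feeding a tFileOutputDelimited; the sketch just shows the data shape involved.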
- Use Axway to FTP Optim files into Hadoop and create tables on top of the data; schedule the shell scripts with cron jobs.
- Use Pig built-in functions to convert fixed-width files to delimited files; create internal and external tables in Hive and merge data sets using Hive joins.
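The fixed-width-to-delimited conversion amounts to slicing each record by known column widths; a minimal Python sketch (the widths and sample record are illustrative assumptions):

```python
def fixed_width_to_delimited(line, widths, delimiter="|"):
    """Split one fixed-width record into delimited fields.

    `widths` lists the column widths in order; each field is
    right-trimmed of padding before being joined.
    """
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].rstrip())
        pos += w
    return delimiter.join(fields)

# e.g. fixed_width_to_delimited("001 Alice   NY", [4, 8, 2])
# → "001|Alice|NY"
```

In Pig the equivalent is a LOAD with a fixed-width loader (or SUBSTRING calls) followed by a STORE with a delimiter.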
- Use Sqoop to import flat files in different formats into HDFS, and handle incremental data loads from RDBMS into HDFS.
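Sqoop's incremental append mode pulls only rows past a check column's last imported value; the same high-water-mark idea can be sketched against SQLite (the table and column names are illustrative):

```python
import sqlite3

def incremental_extract(conn, last_value):
    """Fetch only rows whose id exceeds the previous high-water mark,
    mimicking Sqoop's --incremental append on a --check-column.

    Returns the new rows and the updated high-water mark to persist
    for the next run.
    """
    rows = conn.execute(
        "SELECT id, payload FROM source_table WHERE id > ? ORDER BY id",
        (last_value,),
    ).fetchall()
    new_last = rows[-1][0] if rows else last_value
    return rows, new_last
```

Sqoop stores this last value for you when run as a saved job; the sketch just makes the mechanism explicit.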
- Load and transform large sets of structured data from Oracle and SQL Server into HDFS using Talend Big Data Studio; set up Talend Data Integration and the Talend platform on Windows and Unix systems.
- Create contexts to pass values between parent and child jobs throughout the process; use Git for version control and implement branching for different environments.
- Write complex Java code using tJava, tJavaRow, and tJavaFlex, and resolve heap-space and other memory issues in Talend. Create complex mappings using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, etc.
- Set up AWS S3, EC2, SNS, SQS, and Lambda; configure RDS (MySQL) and Redshift clusters. Create Amazon S3 buckets using tS3BucketCreate, tS3BucketDelete, tS3Get, tS3List, and tS3Put.
- Use the Talend Big Data tool to load high-volume source files from S3 into Redshift; design Amazon Redshift clusters, schemas, and tables, converting the data into relational format for loading.
- Create Talend mappings to populate dimension and fact tables; use file components such as tAdvancedFileOutputXML, tFileInputExcel, tFileInputJSON, tFileInputMSXML, tFileInputXML, tFileOutputARFF, tFileInputDelimited, tFileOutputDelimited, tFileOutputExcel, and tFileOutputJSON.
- Create joblets in Talend for processes reused across most jobs in a project, such as starting and committing a job. Develop jobs to move inbound files to vendor server locations on monthly, weekly, and daily schedules, and create jobs to perform record-count validation and schema validation.
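The record-count and schema validation checks can be sketched in Python; the expected column names and counts here are illustrative assumptions:

```python
import csv

def validate_csv(path, expected_columns, expected_count=None):
    """Check a delimited file's header against an expected schema and,
    optionally, its data-row count.

    Returns (schema_ok, count_ok, actual_count) so a caller can route
    failures to an error handler.
    """
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader, [])
        count = sum(1 for _ in reader)
    schema_ok = header == expected_columns
    count_ok = expected_count is None or count == expected_count
    return schema_ok, count_ok, count
```

In a Talend job the same checks usually sit in a tFileInputDelimited schema plus a tAggregateRow count compared against a control total.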
- Develop an error-logging module that captures both system and logical errors, sends email notifications, and moves failed files to error directories.
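The file-quarantine half of such a module can be sketched as follows; the logger name and directory layout are illustrative, and an email hook (e.g. a logging `SMTPHandler`) could be attached to the logger for notification rather than shown here:

```python
import logging
import shutil
from pathlib import Path

logger = logging.getLogger("etl.errors")

def quarantine(file_path, error, error_dir):
    """Log a processing error (system or logical) and move the
    offending file into an error directory.

    Returns the file's new location so downstream reporting can
    reference it.
    """
    error_dir = Path(error_dir)
    error_dir.mkdir(parents=True, exist_ok=True)
    logger.error("Failed to process %s: %s", file_path, error)
    dest = error_dir / Path(file_path).name
    shutil.move(str(file_path), str(dest))
    return dest
```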
- In parallel with development, act as a Talend admin: creating projects, scheduling jobs, migrating to higher environments, and performing version upgrades.
- Use the Talend Administration Center Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis. Develop, support, and maintain ETL (Extract, Transform, Load) processes using Talend Integration Suite.
- Improve the performance of Talend jobs; migrate objects from DEV to QA and promote them to production.
- Great interpersonal communication skills;
- A keen eye for spotting data trends;
- Great analytical skills;
- A keen grasp of information technology;
- Professional demeanor;
- Personal accountability and strong work ethic;
- Professional, able to interact with vendors/clients;
- Positive, “can-do” attitude.