
Your Name

Sr. Quality Analyst

Piscataway, NJ


  • Over fifteen years of IT experience including Software Development, Quality Assurance and Management.
  • Extensive knowledge of Software Quality Assurance Techniques, IEEE, CMM and ISO standards.
  • In-depth knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of all phases, including requirements, analysis/design, development, testing, deployment and documentation.
  • Extensive experience in converting business requirements into testable requirements and designing Test Cases and Test Scripts.
  • Experience in testing functionality, usability and presentability of software products/modules: Unit Testing, Multi Unit Testing, Integration Testing, System Testing, Regression Testing and Automation Testing.
  • Good communication and coordination skills with UAT teams, customers and inter-department interactions.
  • Experience in managing Test teams, including resource planning, training and scheduling.
  • Extensive experience in managing Test Projects using Quality Center for Test Cases and Defects.
  • Lead experience with Test Automation using Worksoft Certify and Quick Test Professional.
  • Worked on large-scale projects for major companies such as AT&T, Weill Cornell Medical College, Siemens, CitiGroup, Verizon Wireless, Vonage, Telcordia, Dialogic and UNISYS.
  • Software development experience using C, C++ and Shell scripts on UNIX.

Work Experience

Sr. Quality Analyst

AT&T, Middletown, NJ

Oct 2009 – Jul 2012

PMOSS - Performance Management Operations Support System
A Network Performance Management platform providing the infrastructure for real-time alerting and monitoring of IP, VoIP, and ATM/FR network performance, enabling Operations Centers to proactively detect and repair network problems in real time. The system collects measurements from network elements and processes them against predefined thresholds for exception alerting; it also provides performance reports based on the collected data. Testing was performed on GPR (Global Performance Reports), a web-based application.

Responsibilities Include:
  • Based on Requirements, generated Test Plan, Test Cases and Test Summary documents for each release.
  • Took a leadership role in following the SDLC process in the group by training team members whenever needed.
  • Coordinated Defect management between Development, System Test and Upper management. Had weekly meetings with Teams and scheduled production patches and performed production testing.
  • Created a production monitoring method and implemented the process in the group.
  • Trained the team in managing Requirements and Test Cases in Quality Center.
  • Created Requirements Traceability Matrix using Req Pro.
  • Performed front end and backend testing and managed Test Execution status in Quality Center.
  • Performed End-to-End Testing with Upstream and Downstream Systems.
  • Worked with UAT Teams in resolving issues during UAT testing.
  • Reproduced production problems in the System Test Environment and verified fixes.
  • Verified Data from external systems using UNIX and Database with SQL.
  • Created Performance Benchmarks for WEB Reports.
  • Created Automation Scripts using QTP for End User Test scenarios.

Quality Analyst/Automation Lead

Weill Cornell Medical College

Mar 2008 – Jun 2009

Weill Cornell Medical College's primary financial, human resources and higher education systems were upgraded with mySAP ERP ECC 6.0 software, enabling standard higher education and research solution best practices. As a QA, I was responsible for the creation and validation of processes for the SAP implementation.

Responsibilities Include:
  • Created Data Validation Plan and presented to the Department managers and Stakeholders.
  • Conducted QA Audit of Blueprints, Configuration documents and Functional specifications for Critical Business Processes.
  • Coordinated SAP Portal testing with teams for Finance and Human Resources modules.
  • Created Automation Test Framework for SAP HR and Finance Modules.
  • Coordinated training for the team on the new automation tool (Worksoft Certify).
  • Automated end-to-end scenarios for SAP HR and Finance Modules.
  • Validated Unit Test Cases and reviewed results and provided status to Stakeholders.
  • Created a Traceability Matrix and presented it to the teams.
  • Created templates for Integration test cases and Test Readiness checklists.
  • Validated Regression Test Cases for HR and Finance and provided status to the teams.
  • Coordinated System Environment Change Control meetings.
  • Conducted Defects status meetings with the teams.

Test Manager/Configuration Manager

Siemens Transit Technologies

Jul 2007 – Feb 2008

PACIS provides improved passenger information in the form of dynamic messages on Customer Information Screens and Public Address components for train systems. The System includes software and hardware procurement, design, integration, implementation and testing.

Responsibilities Include:
  • As a Configuration Manager for the project, coordinated with teams offshore as well as teams in local office in gathering system hardware and software configuration information.
  • Conducted weekly meetings to track/manage progress of the tasks.
  • Presented the configuration process to the clients, which included Hardware, Software, COTS software and Documentation.
  • Updated System Configuration Management Plan and got it approved by the client.
  • Created System Configuration Design that describes all configuration items and the process of identification and storage.
  • Created checklists for Physical Configuration Audit and Functional Configuration Audit.
  • As a Test Manager implemented Defect Management process and presented to the team.
  • Conducted daily status meetings with Development teams and Integration test teams.
  • Participated in Design Document reviews.
  • Based on Use Cases, created Test Cases in Quality Center.
  • Created weekly defect reports for defect status meetings.
  • Created checklists for Implementation Reviews and Test Readiness Reviews.

Sr. Quality Analyst

CitiGroup, New York, NY

Oct 2006 – Jul 2007

TreasuryVision is a unique web-based service that increases visibility and control, allowing treasury organizations to view their overall positions and forecasts and more effectively manage global liquidity and risk across the enterprise. TreasuryVision provides multi-bank, multi-currency, multi-asset information aggregation; powerful analytic reporting; and sophisticated treasury workflow tools such as cash flow forecasting and account management. In addition, TreasuryVision tracks industry and market developments through third-party research and worldwide news feeds. Worked on the Liquidity Reporting project, which generated reports from TreasuryVision.

Responsibilities include:
  • As a Lead Quality Analyst, reviewed Requirements and produced the System Test Plan, Test Estimations and Test Scenarios for project management. After approval, created System Test Cases and selected the Regression Test suite for projects under test.
  • Conducted meetings to review Test Plans and Test Cases with Analysts and Development teams.
  • Created Automation Test scripts and maintained existing scripts to work with new modifications to User Interface using QTP.
  • Generated reports of Test Execution status and Defects status.
  • Coordinated User Acceptance Test phase, including all logistics, test case execution and defect management.
  • Worked with Analysts, Development and UAT groups in resolving Defects for current projects.
  • Created a plan to resolve deferred defects from previous releases and worked with development in resolving them.
  • Created Regression scenarios for Applications and designed Automation Script strategy.
  • Created scripts in QTP as well as modified existing scripts.

Sr. Quality Analyst

Verizon Wireless, Township of Warren, NJ

May 2006 – Oct 2006

Verizon Wireless owns and operates the nation's most reliable wireless network. Responsibilities included managing projects for the current release and interacting with Analysts, Development and Users to resolve issues such as schedules, status and test environments.

Responsibilities include:
  • As Sr. Quality Analyst, conducted meetings and produced training documents to manage Requirements, Test Cases, Test Lab and Defects.
  • Generated reports of Test Execution status and Defects status in Quality Center and posted them on the web site.
  • Using Sharepoint, created web sites for POS Quality Assurance groups to retrieve and access common information.
  • Prepared Test plans, test cases in Quality Center.
  • Worked with Analysts, Development and UAT groups in resolving Defects for current projects.
  • Created Regression scenarios for Applications and designed Automation Script strategy.

Quality Analyst Team Lead

Vonage, Holmdel, NJ

May 2005 – May 2006

Vonage is a leading SIP-based Voice over IP (VoIP) communications company with multiple web applications covering Customer Care (CCA), Billing, Local Number Portability (LNP), 911 Selection, Web Accounts, Small Business, Residential, Subscribe, Wholesale, Sales and Marketing, Canada Market, UK Market, Credit Card Validation, New Features and Web Voicemail.

Responsibilities include:
  • As a Team Lead trained new members to learn Vonage Applications and environment.
  • Involved from Requirements stage to deployment stage of Software development cycle.
  • Prepared Test Plans, Test Cases and automated Test scripts according to requirements in Quality Center.
  • Conducted reviews of Test plans and Test cases.
  • Performed Black Box testing, automation testing, Database testing and backend testing.
  • Performed testing with Third party vendors for Local Number Portability process.
  • Conducted Data Accuracy Testing, Boundary Testing and Performance testing.
  • Prepared test metrics using TestDirector/Quality Center to report daily test status and for testing analysis. Interacted with developers regarding defects opened.
  • Supported previous releases products for any field problems (Trouble Reports).
  • Performed production testing off hours (weekends and nights).
  • Environment: Windows XP, TestDirector, Quality Center, SOAP, XML, Oracle 10G, QTP, JMeter, Visual Basic, UNIX and AWK.

Business Analyst/Quality Analyst

Exim Group LLC, New York, NY

Sep 2004 – May 2005

Exim Group provides technical services and product procurement for international agencies. The project was to create an online web application, an Order Entry Reporting System that facilitates bids, quotations, purchases and orders.

Responsibilities include:
  • Analyzed business requirements and created SRS (Software Requirement Specifications).
  • Based on Requirements created detailed Test plan for testing the functionality of the application.
  • Based on use cases and detailed designs constructed test cases and test scripts.
  • Performed Black Box testing and User acceptance testing according to Test Plan.
  • Coordinated and prioritized outstanding defects and enhancements/system requests based on business requirements.

Software Engineer

Telcordia Technologies, Piscataway, NJ

Apr 1997 – Nov 2001

The project was a GUI software application that accessed and maintained the SS7 Point Codes relational database. I was also responsible for maintaining the SS7 backend configuration table process, which retrieves information from databases and processes events. I concurrently interacted with customers on software issues.

As a Software Engineer performed following activities:
  • Generated Functional Requirements, Design documents.
  • Conducted Requirements, design and Test plan reviews.
  • Designed Application GUI and SS7 Database access routines.
  • Implemented quality Software in C/C++ on AIX.
  • Executed manual testing and Database testing.
  • Reproduced the field problems, made software changes and tested the Software. Used ONTRAC software defect tracking tool for creating Modifications requests.
  • Followed Telcordia QMO (Quality Management Operations) for software products.

SOAC (Service Order Analysis and Control) provides step-by-step service order control and tracking capabilities to form the core of a flow-through provisioning process. In the SOAC development group, delivered software for more than five releases. Work included new functionality and enhancements to interfaces between SOAC/CNUM, SOAC/SOA, SOAC/Force and SOAC/ADSL, as well as DID Centrex features, with participation in requirements, detailed design, implementation, Unit Test, MUT (Multi Unit Testing) and regression testing. I concurrently worked with the SOAC Customer Service Center to resolve customer issues.

As a Software Engineer performed following activities:
  • Interacted with Project managers, System and Test Engineers for all Software releases.
  • Generated design documents, test plan and user guide using Frame Maker.
  • Contributed to Requirements, design and test plan (STTS) reviews.
  • Implemented quality Software in C on UNIX and performed Unit, Multi Unit testing and Regression testing on OS1100 and MVS using MYNAH 4 & 5 testing tool.
  • Used ONTRAC software defect tracking tool for creating Modifications requests and worked with clients on software issues.
  • Followed Telcordia QMO (Quality Management Operations) procedures for software products.

Software Engineer/System Test

DIALOGIC Corporation, Parsippany, NJ

1996 – 1997

As the Software Project Lead for Speech Products, served as the liaison between customers and vendors. Was responsible for support and testing of C-based and C++-based Speech Products running on Windows NT, UNIX, and OS/2 systems. Worked on VR160, TTS (Text to Speech) and Antares products. Developed, implemented and tested solutions to reported problems.

As a Software Lead performed following activities:
  • Conducted weekly status meetings with vendors and clients.
  • Interacted with Project managers, System Managers and QA Managers for all Software issues.
  • Reproduced customers' Trouble Reports (TRs).
  • Wrote C/C++ programs to test Voice Recognition Application Programming Interface (API) on UNIX, AIX, OS/2 and Windows/NT.
  • Performed System testing and Regression testing.
  • Verified Software solutions and packaged and delivered these solutions to Clients.
  • Updated documents to track the changes.
  • Worked extensively with clients on Software issues.
  • Gave presentations to clients regarding new updates.

Member Technical Staff

AT&T Bell Laboratories, Summit, NJ

1987 – 1989

Enhanced network listener of UNIX System V R4, supporting network services and applications operating over TCP/IP. Implemented private addressing feature.

As a Member Technical Staff performed following activities:
  • Analyzed existing software and documents to incorporate new changes.
  • Created new workflows and documents to incorporate new changes.
  • Implemented software in C on UNIX.
  • Performed unit testing, multi unit testing and regression testing.

Software Engineer

UNISYS, Flemington, NJ

1984 – 1986

Designed and developed data communication software for a microprocessor-based terminal that supported three different applications concurrently using the Poll/Select Protocol, written in Assembly language. The testing of this firmware module was done with the HP Logic/Protocol Analyzer.
Designed and implemented a compiler for NDL (Network Definition Language), using PL/M on an NGEN workstation. Modules included a Lexical Analyzer, Syntax Analyzer and Code Generator.


Education

B.S. in Computer Science

Rutgers University

Additional Information

Technical Skills: 
Languages: C, C++, Java, UNIX/Curses, HTML, PL/M, BASIC, VB, SQL, FORTRAN and Assembly languages 
Data Communication: SIP, SS7, TCP/IP, PPP 
Databases: Oracle, ISS7, ISNA and other Relational Databases 
Tools: Quality Center, QuickTest Professional, ClearQuest, ClearCase, JMeter, Oracle SQL Developer, TOAD, vi, SCCS, Purify, FrameMaker, Microsoft Office