JMETER RESUME EXAMPLES

Your Name

Performance Engineer

Fairfax, VA

your.email@example.com
111-222-3333
www.your-website.com

Work Experience

Performance Engineer

Export-Import Bank of the US, Washington, DC

Mar 2011 - Current

Responsibilities:
  • Developed the Performance Test Plan and Test Strategy, and performed functional analysis of the system.
  • Prepared and executed performance test cases based on careful analysis of the system.
  • Identified and managed project risks, dependencies, and issues.
  • Implemented HP Quality Center for Test Planning, Test Case writing, Test Execution and Requirement Mapping with Test Cases.
  • Involved in defect tracking and reporting using Quality Center.
  • Used Quality Center to maintain and execute QTP and manual test scripts and to manage defects.
  • Performed database validation to verify updated data in the database using SQL queries.
  • Wrote PL/SQL statements to verify correct business logic after deployments.
  • Used JMeter for database back-end testing over JDBC and ODBC connections (see the JDBC validation sketch after this role's Environment line).
  • Involved in developing, executing and maintaining test scripts using LoadRunner, while effectively interfacing with other team members and internal clients to resolve problems.
  • Tested server hardening and performed server health checks with open-source tools such as JMeter and Badboy.
  • Monitored systems using Task Manager, Process Explorer, Performance Monitor, Resource Monitor, and Data Warehouse Monitor on Windows; JConsole for Java-based applications; and System Monitor and topas on UNIX.
  • Quickly debugged LoadRunner issues and problems that stopped execution.
  • Designed a Performance Framework using LoadRunner.
  • Provided LoadRunner analysis graphs for latency (transaction response time), throughput, and pages served per second, plus system resource graphs for key counters such as CPU utilization, memory, and thread usage.
  • Gathered the results from each test run and conducted in-depth analysis on the transaction response times and the performance of each server using LoadRunner.
  • Generated detailed reports, including graphs and tables, for various performance object counters and application transaction times using LoadRunner.
  • Evaluated and reported test results and overall progress periodically to project management.
  • Provided assistance to Project managers to develop and maintain testing schedules in MS Project.
  • Worked closely with Release Management team for all the upcoming builds and releases.

Environment: HP Quality Center, LoadRunner, JMeter, Windows, .NET, SQL Server, C/C++, VB Script, HTML, MS Office, XML, Oracle and UNIX.
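
A minimal sketch of the kind of JDBC back-end validation described in the bullets above. The connection URL, credentials, table, and column names are hypothetical placeholders rather than details from the actual project, and the Oracle JDBC driver is assumed to be on the classpath; an equivalent query could also be configured in a JMeter JDBC Request sampler.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class BackendValidation {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; real values come from the test environment.
            String url = "jdbc:oracle:thin:@//dbhost:1521/TESTDB";
            try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
                 PreparedStatement stmt = conn.prepareStatement(
                     "SELECT status, updated_at FROM loan_applications WHERE application_id = ?")) {
                stmt.setLong(1, 1001L); // ID submitted through the front end during the test
                try (ResultSet rs = stmt.executeQuery()) {
                    if (rs.next() && "APPROVED".equals(rs.getString("status"))) {
                        System.out.println("Back-end update verified at " + rs.getTimestamp("updated_at"));
                    } else {
                        System.out.println("Validation failed: expected record or status not found.");
                    }
                }
            }
        }
    }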

Performance Analyst

Cingular Wireless, Redmond, WA

Jul 2009 - Feb 2011

Responsibilities:
  • Involved in the build process to set up the application for QA testing using application installers, database scripts, and configuration changes.
  • Participated in developing highly readable documentation, including traceability matrices, data flow diagrams, and integration designs based on project requirements and system specifications.
  • Responsible for creating and modifying documents such as the test strategy, test plan, test cases, signoff documents, and daily and weekly status reports.
  • Analyzed and identified problems with the existing QA process and continuously made process improvements to increase test coverage and reduce test cycles.
  • Used HP Quality Center to create test plans and test cases, and to schedule tests, track progress, report status, and manage issues.
  • Performed Defect Management using Quality Center.
  • Responsible for scheduling batch execution of tests and for logging and tracking defects in Quality Center.
  • Wrote SQL queries to perform back-end database testing.
  • Identified the business scenarios for performance testing.
  • Prepared test data used during load test execution.
  • Prepared the Load and Performance Test Strategy (test scope and objectives, environments, resources, risks and mitigation plans, test tools, test techniques, execution approach, result analysis, and test report) and walked the team and the client through it.
  • Designed and implemented a life-cycle approach to Application Performance Testing (pre-deployment) and Monitoring (post deployment) using LoadRunner.
  • Enhanced existing Performance testing framework, leveraging QA software products and tools.
  • Used protocols such as Java, EJB, RDP, Web Services, RTE, Citrix, and custom scripts.
  • Prepared Load/Performance Test Design document and involved in preparing load test scripts, customization of scripts, execution, analysis and reporting using LoadRunner.
  • Coordinated with different teams/vendors during performance testing using LoadRunner.
  • Performed cross-browser compatibility testing on Internet Explorer and Netscape Navigator.
  • Determined the Entry and Exit Criteria for different phases in testing cycle of the system.
  • Performed User Acceptance Testing on behalf of End Users at client's environment.
  • Participated in Release Review/Requirement Analysis and Design review meetings.

Environment: LoadRunner, Windows, ASP.NET, SQL Server, Team Foundation Server, JMeter, Quality Center.

Performance Tester

State Farm, Bloomington, IL

Feb 2007 - Jun 2009

Responsibilities:
  • Involved in creating Performance Test plans and Test Cases for the application based on system requirements.
  • Analyzed the application to determine which parts could be automated and which had to be tested manually.
  • Created use cases for the system interfaces.
  • Performed functional testing, regression testing, integration testing, system compatibility testing, and user acceptance testing.
  • Performed cross-browser compatibility testing on Internet Explorer and Netscape Navigator.
  • Involved in generating test plan, test cases and executing test cases using TestDirector.
  • Used TestDirector to track bugs and generate reports.
  • Wrote SQL queries to test the Oracle database and validate data integrity.
  • Participated in capacity planning and installation of LoadRunner software including Controller, Agent, Load Generators etc.
  • Conducted performance testing under off-peak and peak load conditions using LoadRunner.
  • Performed load, stress, and performance testing with LoadRunner by scripting Vuser scripts for multiple users and multiple transactions synchronized at rendezvous points (see the Vuser sketch after this role's Environment line).
  • Supported the User Acceptance Testing (UAT) phase of the project.
  • Involved in Production Support and Testing.
  • Worked closely with engineering, SCCM, and build release teams responsible for server side installations.
  • Actively participated and followed Agile testing methodology.
  • Responsible for resource and work allocation.

Environment: LoadRunner, TestDirector, SQL Server, Windows, Oracle, Linux and Java.
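
A minimal sketch of a LoadRunner Java Vuser action with a named transaction and a rendezvous point, in the spirit of the Vuser scripting described above. The transaction and rendezvous names are illustrative only; the class layout and lrapi calls follow the standard Java Vuser template, but exact signatures should be verified against the installed LoadRunner version.

    import lrapi.lr;

    public class Actions
    {
        public int init() throws Throwable {
            return 0;                               // one-time setup per Vuser
        }

        public int action() throws Throwable {
            lr.rendezvous("submit_claim");          // hold Vusers here so they fire together
            lr.start_transaction("submit_claim");   // timed business transaction
            // ... calls against the application under test go here ...
            lr.end_transaction("submit_claim", lr.AUTO);
            return 0;
        }

        public int end() throws Throwable {
            return 0;                               // per-Vuser cleanup
        }
    }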

Education

Bachelor of Business Administration in International Management

Virginia Commonwealth University, Richmond, VA

Additional Information

Highlights of Qualifications:
  • Over 6 years of experience in requirements gathering, issue tracking, reporting, and results analysis of performance testing.
  • Experienced in developing and maintaining business testing plans which meet the requirements of change requests.
  • Experienced in various project execution methodologies - RUP, SDLC, Agile/Scrum, Iterative, and Waterfall.
  • Extensively experienced in manual and automated testing of client/server and web-based applications on several platforms, using HP Quality Center/Mercury TestDirector, LoadRunner, JMeter, and Performance Center.
  • Comprehensive exposure to the complete Software Development (SDLC) and Software Testing (STLC) Life Cycle projects including requirements gathering & analysis, estimation, test planning, execution and defect tracking & reporting.
  • Experienced in creating defects in defect management tools such as Quality Center and TestDirector, and in assigning defects, retesting, and generating reports.
  • Experienced in maintaining the Requirements Traceability Matrix (RTM) to ensure comprehensive test coverage of requirements in Quality Center.
  • Experienced in playing Admin Role for Quality Center and have in-depth knowledge of all features in Quality Center with configurations, customization, dashboard reporting and versioning.
  • Experienced in designing standard QA metric reports for bug tracking and reporting using Quality Center.
  • Excellent Database knowledge for Back-end testing using Oracle, MS SQL Server & MS Access.
  • Experienced in creating Performance test plans, developing, executing and maintaining test scripts using LoadRunner.
  • Ability to quickly debug LoadRunner issues and problems that stop execution.
  • Experienced in testing server hardening and performing server health checks with open-source tools such as JMeter and Badboy.
  • Experienced in using JMeter for database back-end testing over JDBC and ODBC connections.
  • Experienced in using monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor, and Data Warehouse Monitor on Windows; JConsole for Java-based applications; and System Monitor and topas on UNIX (a small JVM metrics sketch follows this list).
  • Experienced in designing a performance framework using LoadRunner.
  • Experienced in load, stress, and performance testing using LoadRunner by scripting Vuser scripts for multiple users and multiple transactions synchronized at rendezvous points.
  • Self-starter with the ability to supervise and allocate tasks and responsibilities, define and establish procedures and manage multiple projects simultaneously.
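
As a small illustration of the kind of JVM metrics JConsole surfaces during the monitoring mentioned above, the sketch below reads heap and thread figures through the standard java.lang.management MXBeans. It runs in-process purely for illustration; in practice JConsole attaches to the application under test over JMX, and the class and printed labels here are only examples.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;
    import java.lang.management.ThreadMXBean;

    public class JvmSnapshot {
        public static void main(String[] args) {
            // The same MXBeans JConsole reads over JMX, accessed here in-process.
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();

            MemoryUsage heap = memory.getHeapMemoryUsage();
            System.out.printf("Heap used: %d MB of %d MB max%n",
                    heap.getUsed() / (1024 * 1024), heap.getMax() / (1024 * 1024));
            System.out.printf("Live threads: %d (peak %d)%n",
                    threads.getThreadCount(), threads.getPeakThreadCount());
        }
    }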

TECHNICAL SKILLS:
Testing Tools: Quality Center, TestDirector, LoadRunner, Performance Center and JMeter
Languages: C, C++, Java, Visual Basic, SQL, PL/ SQL, HTML, Shell Scripting
Web Applications: HTML, DHTML, XML
Operating Systems: Windows, UNIX/Linux
Databases: Oracle, MS SQL Server, MS Access, DB2
Web/Application Servers: Apache Tomcat, IIS, WebLogic