
Sunday, October 22, 2017

C drive free space was going down

My HDD free space is usually around 30 GB. On weekends last month, I noticed that the machine was running very slowly. I checked the HDD free space, which had dropped below 10 GB. Running system cleanup on the C drive recovered only another 500 MB.

A few Google searches turned up a utility, TreeSize, which shows how much space each directory uses.

Eventually I found that the Office cache held more than 30 GB across the Office 2013 (15.0) and Office 2016 (16.0) versions. I removed all cache files from the following directories, and machine performance is much better now.
**Office Cache - C:\Users\<username>\AppData\Local\Microsoft\Office\16.0\OfficeFileCache
**Office Cache - C:\Users\<username>\AppData\Local\Microsoft\Office\15.0\OfficeFileCache

Thursday, July 16, 2015

Dynamics AX Performance Testing by using Coded UI tests

Recently we completed Dynamics AX performance testing using functional test automation scripts developed with Visual Studio Coded UI. Performance testing the AX thick client via the X++ code approach (Benchmark Toolkit) was time-consuming. I have been in test automation for many years, and repurposing functional test scripts for performance testing raised a few challenges. We used Microsoft terminal servers to simulate multiple users' RDP sessions. Below is the list of factors I found helpful in this methodology.

Best practices for coding
  • Don't open the AX thick client for each iteration. For example, if a sales order should be created 15 times per user, open the AX client only once per user and create all 15 sales orders without restarting it
  • Open the AX client from its Windows path rather than a shortcut, and make sure the same path is available on all terminal servers
  • Write methods to log the start time and end time of each transaction
  • Don't use overly descriptive or heavy search criteria to find controls; keep in mind that the testing tool also consumes CPU and memory
  • Keep control-identification calls outside the time-measuring statements. A functional script and a performance script serve very different purposes
  • You can use a UI Map (object repository) or the Page Object model
  • Prefer shortcut keys; they simplify the script and speed up execution
  • Put the loop inside the test method; don't call the test method multiple times
  • Use data sources to provide different test data for each user. Read the data once and reuse it wherever required
  • Log timings only around essential code, and allow for the few seconds the automation tool takes to identify a particular screen or UI object
  • Once a script/scenario is complete, run it with a few users and validate it
  • Avoid left-click navigation by entering the module directly in the address bar
  • Sleep between transactions, but do not include the sleep in transaction timings
  • Refine the steps wherever possible
  • Follow a naming convention for transaction names; keep them simple and easy to use
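The points above about logging start/end times and keeping control identification outside the measured span are language-agnostic. Here is a minimal sketch in Python; the `TransactionLogger` name is my own, not part of Coded UI or any framework:

```python
import time

class TransactionLogger:
    """Minimal transaction timer: records only the span
    between start() and end() for each named transaction."""

    def __init__(self):
        self.records = []   # list of (name, elapsed_seconds)
        self._open = {}     # name -> start timestamp

    def start(self, name):
        # Do control identification / navigation BEFORE calling start(),
        # so tool overhead is not counted in the transaction time.
        self._open[name] = time.perf_counter()

    def end(self, name):
        elapsed = time.perf_counter() - self._open.pop(name)
        self.records.append((name, elapsed))
        return elapsed

logger = TransactionLogger()
logger.start("CreateSalesOrder")
time.sleep(0.05)            # stands in for the measured UI action
elapsed = logger.end("CreateSalesOrder")
```

The same shape maps directly onto the `TransactionStart`/`TransactionEnd` calls in the Coded UI sample further down.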


Best practices related to performance testing
  • Transaction timings can be logged as XML, TXT, database records, etc. I prefer a database, so that you can save data across multiple runs and use queries to compute different metrics such as average response time, 90th-percentile response time, minimum response time, and maximum response time
  • Create a couple of methods to log the start and end time of each transaction
  • Capture script errors as well as AX exceptions
  • Update the information for
  • Implement the test method to run for a given duration taken from a config file; for example, the scripts may have to run for 60/90/120 minutes
  • Implement user frequency. For example, if a sales order should be created 15 times per user, the script should stop after creating 15 sales orders for each user
  • Capture screenshots when a test fails for any user. Keep the images in a shared path, and keep the image size small
  • Modify any script that creates more transactions than expected
  • Validate whether each transaction time covers only application time or includes Coded UI time as well
  • Keep data files in a shared path for ease of use and maintenance
  • Use a Visual Studio load test to collect performance counters from the various servers: AOS servers, AX batch servers, AX integration servers, terminal servers, etc.
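Once the timings are stored, the metrics mentioned in the first point above are straightforward to compute. A sketch in Python, assuming the response times have already been read from the database into a list (the percentile convention below — smallest value covering 90% of samples — is one common choice, not the only one):

```python
import math

def response_time_metrics(timings):
    """Summarize a list of response times (seconds) into the
    usual load-test metrics: average, 90th percentile, min, max."""
    ordered = sorted(timings)
    # 90th percentile: smallest value that covers 90% of the samples
    idx = max(0, math.ceil(0.90 * len(ordered)) - 1)
    return {
        "avg": sum(ordered) / len(ordered),
        "p90": ordered[idx],
        "min": ordered[0],
        "max": ordered[-1],
    }

metrics = response_time_metrics(
    [1.2, 0.8, 1.5, 2.0, 0.9, 1.1, 1.3, 1.0, 1.4, 3.2])
```

Keeping the raw per-transaction rows in the database (rather than pre-aggregated numbers) is what makes queries like this possible after the run.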


Batch files to execute CodedUI Tests
Always build the DLLs in Release mode instead of Debug mode; it improves the system's resource utilization.
echo on
set logfile=C:\AX_PerformanceTest\ExecSmallSO.log
echo "%date% - %time% - MediumSO started"
cd\
cd "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow"
vstest.console.exe "C:\AX_PerformanceTest\DLL\AX_PerformanceTest.dll" /Tests:CreateSmallSaleOrder
pause


Sample Coded UI methods
Scenarios should be implemented carefully, since these scripts are used for performance testing.
public void clickSellTab()
{
    axAppWindow.SetFocus();
    myutils.ClickButton(axAppWindow, "SellTab");
}

public void ClickBookMenu(ref Logger.Logger LogObj)
{
    try
    {
        axAppWindow.SetFocus();
        WinMenuItem item = new WinMenuItem(axAppWindow);
        item.SearchProperties[WinMenuItem.PropertyNames.Name] = "Book";
        UITestControlCollection ab = item.FindMatchingControls();
        LogObj.TransactionStart("BookingOrder");
        Mouse.Click(item);
        Thread.Sleep(1000);
        this.clickSellTab();
        // axAppWindow.WaitForControlEnabled(50000);
        axAppWindow.SetFocus();
        item.WaitForControlReady(50000);
        LogObj.TransactionEnd("BookingOrder");
    }
    catch (Exception ex)
    {
        LogObj.LogException(LogObj.ScenarioName, ex.ToString());
    }
}


Few links related to Coded UI Tests


Monday, January 30, 2012

Performance Testing Guidance from Microsoft

I always admire Microsoft for its documentation. Right from the OS to Office applications, you can find good documentation. Microsoft has shared very good articles on performance testing and tuning.

You can find more articles related to performance testing in
Patterns & Practices: Performance Testing Guidance
. Only the How-Tos part is listed here. Thanks to Microsoft!

Performance Testing


Capacity

Load Testing

Stress Testing

Test Cases/Scripting

Troubleshooting

Tuning

Workload Modeling

Hope all these links are useful.

Sunday, January 29, 2012

JVM Monitoring

A couple of our applications run on the Tomcat server, and JMeter was used as the load testing tool. To monitor Java memory, I used two options: the JMX (Java Management Extensions) console and Psi-Probe.

To implement those options, add the following entry to catalina.sh (on Windows, put the same options in catalina.bat as a single `set CATALINA_OPTS=...` line):

CATALINA_OPTS="-Dcom.sun.management.jmxremote \
-Dcom.sun.management.jmxremote.port=9005 \
-Dcom.sun.management.jmxremote.ssl=false \
-Dcom.sun.management.jmxremote.authenticate=false"

To access the JMX console, run ==> [java_installation]\bin\jconsole hostname:port
To use Probe, deploy it in the Tomcat server instance and then access the Probe application.

Sunday, December 4, 2011

Skills for Performance Testing

Earlier I wrote about Skills for Automation. Here I attempt to list the skills for performance testing. Performance testing is very dynamic and spans many technologies and concepts; the depth of knowledge required depends on the project requirements and environment, and the job demands both technical and non-technical skills.

Skills for Performance Testing

  • Analytical skills
  • Scripting
  • Operating System Concepts
  • Networking concepts
  • Memory Management concepts
  • Database Concepts
  • Hardware concepts
  • Protocols concepts
  • Debugging
  • Log Analysis
  • Documentation

Suggested links:
SQAForums's post - Need opinion from exp. load testers- skillset
Technical Skills For Performance Testers
What Skills Performance Testers Need and How to Get Them?

Saturday, November 26, 2011

JMeter - JDBC Driver issue

I was trying to execute a JDBC query through JMeter and got an error like java.sql.SQLException: No suitable driver found. I had put ojdbc6.jar on the classpath and installed the Oracle client, but JMeter still threw the error. The solution: ojdbc6.jar must be copied into JMeter's lib folder (<JMeter installation directory>\lib).

Later, the same script was copied to a server and executed there, and I got the same error. That server had neither an Oracle client installation nor ojdbc6.jar in the JMeter lib folder. After copying ojdbc6.jar, the script ran successfully.
Note:
ojdbc6.jar should be used for Oracle 11g; also, it is supported only on Java 1.5 and above.

Tuesday, July 19, 2011

Virtualized Servers for Performance Testing

We had to do benchmark testing as well as stress testing. A high-performance system had already been purchased, and virtual machines were created for the performance testing setup. I can see a few limitations with this approach, since the production boxes are all physical servers. Earlier I had used virtual machines for automated regression execution.

Issues/Limitations of Virtualized Servers

  1. if (production environment != load test environment) then results could be wrong
  2. Increased risk of deployment failure

A Google search turned up the following links, which give all the details, including VMware's claims.

Virtualization: Performance testing

Measuring the Performance Impact of Virtualizing a Web Application Server

Response to 'Load Testing a Virtual Web Application'

Virtualization performance testing tips


Has anybody experienced load testing on virtualized servers? Please share your experience...

Sunday, July 10, 2011

JMeter Tip - javascript eval

For my JMeter script, I was trying to implement this formula as a JavaScript statement --> pitia=${cpi}+${cti}+(${eshortage}/60)
I used code like this --> ${__javaScript( eval ('${cpi}'.concat('+'\,'${cti}'\,'+'\,'${eshortage}'\,' /60' ) ) )}
It worked in a simple script, but it did not work in my business script.

In jmeter.log, I got the exception like below:
2011/07/07 17:30:30 ERROR - jmeter.functions.JavaScript: Error processing Javascript org.mozilla.javascript.EcmaError: SyntaxError: missing ; before statement (<cmd>#1(eval)#1)
at org.mozilla.javascript.ScriptRuntime.constructError(ScriptRuntime.java:3229)
at org.mozilla.javascript.DefaultErrorReporter.error(DefaultErrorReporter.java:78)
at org.mozilla.javascript.Parser.addError(Parser.java:126)
at org.mozilla.javascript.Parser.reportError(Parser.java:132)
at org.mozilla.javascript.Parser.statementHelper(Parser.java:1175)
at org.mozilla.javascript.Parser.statement(Parser.java:623)
at org.mozilla.javascript.Parser.parse(Parser.java:355)
at org.mozilla.javascript.Parser.parse(Parser.java:293)

The reason for the failure was a simple space in the value of the variable 'cpi'. The JMeter error did not indicate the real cause; you have to debug each variable.
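This class of failure is easy to reproduce outside JMeter: a stray space inside a substituted numeric value turns the concatenated expression into invalid syntax for eval. A sketch of the failure and the fix (stripping values before substitution), shown in Python rather than JMeter's embedded JavaScript; the values are illustrative:

```python
cpi, cti, eshortage = "1 0", "20", "5"   # "1 0": a value with a stray space

# Splicing the raw values in reproduces the broken expression "1 0+20+(5/60)"
expr = cpi + "+" + cti + "+(" + eshortage + "/60)"
try:
    eval(expr)          # fails with a SyntaxError, like the __javaScript call
    failed = False
except SyntaxError:
    failed = True

# The fix: sanitize each variable before building the expression
clean = "+".join(v.replace(" ", "") for v in (cpi, cti)) \
        + "+(" + eshortage + "/60)"      # "10+20+(5/60)"
pitia = eval(clean)
```

In JMeter the same sanitization can be done where the variables are set, before they reach the `__javaScript` function.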

Wednesday, February 27, 2008

Regression Testing

Regression Testing: retesting previously tested segments to ensure that they still function properly after a change has been made to another part of the application.

Objectives:

  • It determines whether the system documentation remains current
  • It determines that system test data and test conditions remain current
When to use regression testing:
When there is a high risk that new changes may affect unchanged areas of the application system.

Performance testing:
This testing verifies that the system performs as specified at predetermined levels: wait times, static processes, dynamic processes, and transaction processes. It is tested at both the client/browser and server levels.

Example:

  • In database systems, response time is the time to obtain a report after clicking a specific button. It may be difficult to specify a response time for each and every form/report, but a generally specified figure, for example 30 seconds, is reasonable.
  • In real-time or embedded systems, performance parameters are very significant. If the system demands that a valve be opened within 10 milliseconds once the temperature exceeds 40 degrees, then failing to meet this performance requirement may result in a catastrophe.
Hence performance testing is compulsory in process control and telecommunication software systems.

Sanity Testing:

Sanity testing is cursory testing, performed whenever cursory testing is sufficient to prove the application is functioning according to specifications. It is a subset of regression testing and normally includes a set of core tests of basic GUI functionality to demonstrate connectivity to the database, application servers, printers, etc.

Smoke Testing:

Smoke testing is non-exhaustive software testing, ascertaining that the most crucial functions of a program work, but not bothering with finer details.


Monday, February 18, 2008

Tip1 - To measure the App or System performance

Using Windows Performance Monitor, we can collect performance data automatically from local or remote computers. The steps below show how to set up a counter on your machine or server. Depending on the options chosen, the user can set up any type of counter.

Steps:
1. Go to Start -> Run.
2. Type "PerfMon" and click the OK button.
3. The Performance application window opens.
4. Go to Performance Logs and Alerts -> Counter Logs.
5. In the right-side pane, right-click on empty space and select 'New Log Settings'.
6. Enter a name for the log file.
7. In the General tab, you can see the log file name. Click the Add Counters button.
8. Select the computer, 'Processor' as the performance object, and '% Processor Time' as the counter from the list.
9. Select '_Total' as the instance from the list and click the Add button.
10. Next select 'Memory' as the performance object and 'Available MBytes' as the counter from the list.
11. Click the Add button, then the Close button. Set the interval period.
12. Go to the Log Files tab. Give the log file path, select 'Text File - CSV' as the log type, and click the Apply button.
13. Go to the Schedule tab and select manual start.
14. The log settings entry has now been created. To run it, select the entry, right-click, and select Start before running your application.
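Once the counter log is written as CSV (step 12), it can be summarized with a short script. A sketch in Python; the column headers below are simplified for illustration — a real PerfMon log embeds the machine name and full counter path (e.g. \\MACHINE\Processor(_Total)\% Processor Time):

```python
import csv
import io

# Illustrative PerfMon-style CSV; in practice, read the log file
# produced by the counter log instead of this inline sample.
log = io.StringIO(
    "Time,% Processor Time,Available MBytes\n"
    "10:00:00,25.0,2048\n"
    "10:00:15,75.0,1900\n"
    "10:00:30,50.0,1950\n"
)

rows = list(csv.DictReader(log))
avg_cpu = sum(float(r["% Processor Time"]) for r in rows) / len(rows)
min_mem = min(float(r["Available MBytes"]) for r in rows)
```

Average CPU and minimum available memory over the run are usually the first two numbers worth checking after a test.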