White Paper: Flex Data Services Performance Tests
Version 1.0 – January 19th, 2007

Sven Ramuschkat
Dirk Eismann

Herrlich & Ramuschkat GmbH
Vahrenwalder Strasse 156
D-30165 Hannover, Germany
Phone +49 (0)511 - 59 0 95 - 0 Fax – 590

© 2007 HERRLICH & RAMUSCHKAT GmbH
http://www.richinternet.de
http://www.richinternet.de/blog
http://www.flexperten.de
http://www.flexforum.de


1. Table Of Contents

1. Table Of Contents
2. Introduction
3. Test Environment
4. Performance Test #1: Remote Object Service
   4.1. MXML-Test File
   4.2. Server Side Business Logic
   4.3. XML Configuration File: remoting-config.xml
   4.4. Test Results
5. Performance Test #2: Messaging Service
   5.1. MXML-Test File
   5.2. XML Configuration File: messaging-config.xml
   5.3. Test Results
6. Performance Test #3: Data Management Service
   6.1. MXML-Test File: Data Service Producer Application
   6.2. MXML-Test File: Data Service Consumer Application
   6.3. Server Side Business Logic
   6.4. XML Configuration Files
   6.5. Results
7. Conclusion


2. Introduction

On November 30th, 2006, Adobe released the Flex Stress Testing Framework on Adobe Labs [1].

The Flex Stress Testing Framework – as the name implies – allows you to test and measure the performance of a Flex Data Services (FDS) powered application. The framework provides an API and some tools to easily set up new test scenarios.

The Flex Data Services (FDS) product is available in three editions:

• The free Express Edition, which is limited to 1 physical CPU (Dual Core counts as 1 CPU). No restrictions concerning maximum connections/users. Cannot be clustered.

• The Department Edition, which needs one license per physical CPU. Can be clustered but has a maximum of 100 concurrent connections/users in total (i.e. even with 3 CPUs only 100 concurrent users can be managed).

• The Enterprise Edition, which also needs one license per CPU. No further limitations.

In order to give a good estimate of how many licenses are needed for a given customer project, it is essential to know how the Flex Data Services will perform and scale.
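To make the sizing question concrete, the edition limits above can be encoded in a short sketch (Python; the function name and error handling are illustrative, the CPU and connection limits are taken from the edition list):

```python
def fds_licenses_needed(edition, cpus, concurrent_users):
    """Rough per-CPU license estimate based on the edition limits above.

    Raises ValueError if the edition cannot support the requested load.
    Illustrative only - real sizing depends on measured throughput.
    """
    if edition == "express":
        if cpus > 1:
            raise ValueError("Express Edition is limited to 1 physical CPU")
        return 0  # free edition, no licenses needed
    if edition == "department":
        if concurrent_users > 100:
            raise ValueError("Department Edition caps at 100 concurrent users")
        return cpus  # one license per physical CPU
    if edition == "enterprise":
        return cpus  # one license per CPU, no further limits
    raise ValueError("unknown edition")

# e.g. a 3-CPU cluster serving 250 concurrent users needs Enterprise
print(fds_licenses_needed("enterprise", cpus=3, concurrent_users=250))  # 3
```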

On the same topic, Adobe released a white paper called "Flex Data Services 2: Capacity Planning" [2].

Adobe's white paper first gives an overview of the core Flex Data Services architecture and then discusses how to tune certain aspects of an FDS-powered application. Finally, the results of some load and stress tests are shown. After studying that white paper, however, you probably still do not know how FDS-powered applications that use Remote Objects, Messaging and the Data Management feature will actually perform and scale.

Maybe you already have a good idea about the kind of data your application will produce and how many concurrent connections are needed, and you only need some real test results. Maybe you just want to check out the total number of concurrent connections FDS can handle. Maybe you are just curious.

We have created several test setups for the Flex Stress Testing Framework that can be used as a guide for your own tests. The tests are explained in more detail in the next sections.

Of course, not every test here may be applicable to your project and your requirements – or, to quote from Adobe's white paper: "No two applications are exactly alike, and there are many factors that may impact performance" – but they should give you a good idea about FDS's capabilities.

3. Test Environment

The following configuration was used on the server side:

1 x IBM eServer 336, 3.2 GHz XEON (1 CPU, Single Core), 2 GB RAM, 140 GB RAID 1 HDD

• Windows Server 2003, Standard Edition R2
• Apache Tomcat 5.5 (using Sun's JDK 1.5 JRE)
• Java Open Transaction Manager (JOTM)
• Flex Data Services 2 (Express Edition)
• HSQLDB (in-memory database)
• jTDS 1.2 SQL Server JDBC Driver

[1] http://labs.adobe.com/wiki/index.php/Flex_Stress_Testing_Framework
[2] http://www.adobe.com/products/flex/whitepapers/pdfs/flex2wp_fdscapacityplanning.pdf


In addition, to use the Flex Stress Testing Framework you need two different client roles: one or more clients that make requests against FDS using a remote-controlled Flex application (i.e. the application that implements a test), and a client that runs the main controller console (i.e. the machine that starts/stops the test run and also measures the performance).

Each client that acts as a test runner instantiates one or more browsers and in turn loads the test application into each browser – so in effect, with 4 clients configured to start 10 browsers each, 40 concurrent connections to FDS can be created.
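The scaling arithmetic here is simply multiplicative, since each browser instance holds one connection to FDS. A minimal sketch:

```python
def concurrent_connections(test_runners, browsers_per_runner):
    # each browser instance opens one connection to FDS
    return test_runners * browsers_per_runner

# the setup described above: 4 test runner clients, 10 browsers each
print(concurrent_connections(4, 10))  # 40
```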

For the test runners we used:

4 x Acer Notebooks, Intel Dual Core 1.6 GHz, 1.5 GB RAM, 60 GB HDD

• Windows XP Professional, SP2
• Internet Explorer 7 with Flash 9 Player
• Sun's 1.5 JRE

Every test runner hosts the BrowserServer application that comes with the Stress Testing Framework and can be controlled by the computer hosting the TestAdmin console:

1 x IBM Thinkpad T40, Intel Pentium M 1.6 GHz, 2 GB RAM

• Windows XP Professional, SP2
• Internet Explorer 7 with Flash 9 Player

This computer also hosts the data producing application that is used in the Data Management test below.


4. Performance Test #1: Remote Object Service

The Remote Object test uses a test application based on the Flex Stress Testing Framework. The test application uses the RemoteObject class to connect to an FDS destination. The destination uses the default AMF3-over-HTTP channel of a default installation and the JavaAdapter to invoke the business logic. The business logic is exposed as a Java class and connects to a MS SQL Server 2000 database containing 20,000 records of census data. The database runs on a remote machine in the same local network. This test was done with result set sizes of 50 and 100 rows.

4.1. MXML-Test File

This test calls a method on the remote Java class. After a result or fault event has been received, the application invokes the same remote method call again. For every result and fault, a counter is increased. The throughput is calculated from these two values after the test has finished.

<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" creationComplete="run()">

  <mx:Script>
    <![CDATA[

      private var count:int = 0;
      private var failureCount:int = 0;

      // every stress test will need to create an instance of Participant
      private var p:Participant;

      public function run():void {
        // the framework passes in the unique name or id
        // of the Participant at runtime
        var id:String = Application.application.parameters.id;

        // property specifying whether or not this test should be
        // managed by the LoadRunner application
        var monitored:Boolean = (id != null);
        var name:String = id;

        // create a Participant instance passing in the unique name
        // and whether or not the test is managed
        p = new Participant(name, monitored);

        // set up event listeners for starting and stopping the test
        Participant.eventdispatcher.addEventListener("startRequest", startTest);
        Participant.eventdispatcher.addEventListener("timeUp", stopTest);
      }

      // this is called by the test framework. From here you call
      // startTest on the participant
      private function startTest(e:Object):void {
        p.startTest(startRemoteObjectTest, e.duration);
      }

      private function startRemoteObjectTest():void {
        callRemoteMethod();
      }

      private function callRemoteMethod():void {
        // 50 = number of rows to fetch from the database
        service.getElements(0, 50);
      }

      // stopTest is called by the framework
      private function stopTest(e:Object):void {
        if (p.testCount == 0)
        {
          // when the framework calls stopTest we set the stop time
          p.testStopAt = getTimer();

          // update the testCount property with the number
          // of successful requests
          p.testCount = count;

          // update the failureCount property with the number
          // of failed requests
          p.failureCount = failureCount;
        }
      }

      private function onResult():void {
        // increase success counter and call remote method again
        count++;
        callRemoteMethod();
      }

      private function onFault():void {
        // increase failure counter and call remote method again
        failureCount++;
        callRemoteMethod();
      }

    ]]>
  </mx:Script>

  <mx:RemoteObject
    id="service"
    destination="census"
    result="onResult()"
    fault="onFault()"
  />

</mx:Application>

4.2. Server Side Business Logic

The server side business logic is written in Java. We used the Java classes from Adobe's Census application that comes with the FDS installation. The server side business logic uses the jTDS 1.1 JDBC Driver to connect to a MS SQL Server 2000 where the census data is stored. The census table contains 20,000 rows with the following data structure:

4.3. XML Configuration File: remoting-config.xml

There is nothing special about this configuration. It uses the my-amf AMF channel.

<destination id="census">
  <channels>
    <channel ref="my-amf"/>
  </channels>
  <adapter ref="java-object"/>
  <properties>
    <source>samples.census.CensusService</source>
  </properties>
</destination>

4.4. Test Results

Several test runs were made with an increasing number of browser instances. Run time per test was 60 seconds. We measured the total number of successful requests and used the Windows Task Manager to get an idea of the CPU load. The row with the best message throughput is marked with an asterisk (*).

RemoteObject Test, Census DB, Result set size: 100 rows

# Browser-Servers   # Browser instances   Total # Browsers   Total # result sets   Result sets/sec.   % CPU load
4                   1                     4                  3900                  65.00              30-40
4                   5                     20                 5878                  97.96              40-80
4                   10                    40                 12701                 211.68 *           80-90
4                   15                    60                 12240                 204.00             80-90
4                   20                    80                 8541                  142.35             80-90
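Since each test ran for 60 seconds, the results-per-second column is simply the total divided by the run time; for example, for the best run in the table above:

```python
run_time_seconds = 60
total_result_sets = 12701  # best run: 4 browser-servers x 10 browser instances
throughput = total_result_sets / run_time_seconds
print(round(throughput, 2))  # 211.68 result sets per second
```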


RemoteObject Test, Census DB, Result set size: 50 rows

# Browser-Servers   # Browser instances   Total # Browsers   Total # result sets   Result sets/sec.   % CPU load
4                   1                     4                  4583                  76.38              30-40
4                   5                     20                 18969                 316.15 *           60-80
4                   10                    40                 16468                 274.46             70-95
4                   15                    60                 14986                 249.76             70-95
4                   20                    80                 13927                 232.11             70-95

5. Performance Test #2: Messaging Service

The Messaging Service test uses a test application based on the Flex Stress Testing Framework. The test application subscribes as a consumer to an FDS messaging destination and also acts as a message publisher. This setup uses the FDS ActionScriptAdapter (i.e. no additional server side infrastructure like JMS).

5.1. MXML-Test File

The test application uses a Producer and a Consumer to send messages to FDS and to receive broadcast messages from FDS. After a message is received from the network, the application sends another message. Every message correctly sent and received increases a counter.

<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" creationComplete="run()">

  <mx:Script>
    <![CDATA[

      import mx.messaging.messages.*;
      import mx.messaging.events.*;
      import mx.utils.ObjectUtil;

      private var count:int = 0;
      private var failureCount:int = 0;
      private var p:Participant;

      public function run():void {
        var id:String = Application.application.parameters.id;
        var monitored:Boolean = (id != null);
        var name:String = id;

        p = new Participant(name, monitored);

        Participant.eventdispatcher.addEventListener("startRequest", startTest);
        Participant.eventdispatcher.addEventListener("timeUp", stopTest);
      }

      private function startTest(e:Object):void {
        consumer.subscribe();
        p.startTest(startMessagingTest, e.duration);
      }

      private function startMessagingTest():void {
        sendMessage();
      }

      private function stopTest(e:Object):void {
        if (p.testCount == 0) {
          p.testStopAt = getTimer();
          p.testCount = count;
          p.failureCount = failureCount;
        }
      }

      private function sendMessage():void {
        var message:AsyncMessage = new AsyncMessage();
        message.body = "Automessage";
        producer.send(message);
        count++;
      }

      private function onResult(event:MessageEvent):void {
        log.text += event.message.body + "\n";
        lograw.text += "Message: " + ObjectUtil.toString(event.message) + "\n";
        count++;
        sendMessage();
      }

      private function onFault():void {
        log.text += "Error" + "\n";
        lograw.text += "Message: " + "Error" + "\n";
        failureCount++;
        sendMessage();
      }

    ]]>
  </mx:Script>

  <mx:Producer id="producer" destination="myChat"/>

  <mx:Consumer id="consumer" destination="myChat"
    message="onResult(event)"
    fault="onFault()"
  />

  <mx:TextArea id="log" width="100%" height="100%" editable="false"/>
  <mx:TextArea id="lograw" width="100%" height="100%" editable="false"/>

</mx:Application>


5.2. XML Configuration File: messaging-config.xml

<dest i nat i on i d=" myChat " >

<adapt er r ef ="act i onscr i pt " / >

<channel s>

<channel r ef =" my- r t mp"/ >

</ channel s>

<propert i es>

<net wor k>

<sessi on- t i meout >20</ sessi on- t i meout >

<t hrot t l e- i nbound pol i cy=" ERROR" max- f r equency=" 0"/ >

<t hrot t l e- out bound pol i cy=" REPLACE" max- f r equency=" 0"/ >

</ net wor k>

<ser ver >

<max- cache- si ze>1000</ max- cache- si ze>

<message- t i me- t o- l i ve>0</ message- t i me- t o- l i ve>

<dur abl e>t r ue</ dur abl e>

<dur abl e- st or e- manager >

f l ex. messagi ng. dur abi l i t y. Fi l eSt or eManager

</ dur abl e- st or e- manager >

<max- f i l e- si ze>200K</ max- f i l e- si ze>

<batch-wr i t e- si ze>10</ batch- wr i t e-s i ze>

</ server >

</ pr oper t i es>

</ dest i nat i on>

5.3. Test Results

Several test runs were made with an increasing number of browser instances. Run time per test was 60 seconds. We measured the total number of successfully sent and received messages and used the Windows Task Manager to get an idea of the CPU load. The row with the best message throughput is marked with an asterisk (*).

Messaging Test, every client sends and receives a message

# Browser-Servers   # Browser instances   Total # Browsers   Total # messages   Messages/sec.   % CPU load
4                   1                     4                  12556              209.26          15-20
4                   5                     20                 35930              598.83          95-100
4                   10                    40                 55808              930.13          95-100
4                   15                    60                 63987              1066.45 *       95-100


6. Performance Test #3: Data Management Service

The Data Management Service test uses two applications. The first application is based on the Flex Stress Testing Framework and acts as a data consumer. The consuming application connects to a Data Management destination and fills a client-side ArrayCollection.

The second application is a simple Flex application (not a test run by the framework) that acts as a data manipulating application. The application connects to the same destination and automatically invokes create, update and delete operations on the managed ArrayCollection.

6.1. MXML-Test File: Data Service Producer Application

The data manipulating application runs on a separate PC and fills an ArrayCollection by using a DataService instance. It then automatically updates, creates and deletes records in the ArrayCollection and calls commit() on the DataService. It does so in a sequential manner and restarts the update/create/delete process after the last delete operation returns successfully.

<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
  layout="vertical"
  creationComplete="init()">

  <mx:Script>
    <![CDATA[

      import mx.collections.ItemResponder;
      import mx.rpc.AsyncToken;

      [Bindable]
      private var faults:uint;

      [Bindable]
      private var results:uint;

      private var timer:Number;
      private var isRunning:Boolean;

      private function init():void {
        if (parameters.autoRun && parameters.autoRun.toLowerCase() == "true") {
          // if the URL parameter autoRun=true is present then start
          // automatically, otherwise the start button has to be clicked
          start();
        }
      }

      private function start():void {
        // start the data manipulation process
        release();
        timer = getTimer();
        isRunning = true;
        faults = 0;
        results = 0;
        metrics.text = "";
        items = new ArrayCollection();
        addResponder(ds.fill(items), "fill");
      }

      private function stop():void {
        // stop the data manipulation process
        isRunning = false;
        timer = getTimer() - timer;
        metrics.text = "Total: " + uint(results + faults).toString() +
          " messages received in " + timer + " ms";
      }

      private function release():void {
        try {
          ds.releaseCollection(items, true);
        } catch (e:Error) {}
      }

      private function addResponder(token:AsyncToken, name:String):void {
        token.addResponder(new ItemResponder(result, fault, name));
      }

      private function createItem():Object {
        // create a new empty record
        return getItem();
      }

      private function updateItem(o:Object):Object {
        // updates an item and returns the new version
        var n:Object = getItem(o.PRODUCT_ID);
        var p:String;
        for (p in o) { o[p] = n[p]; }
        return o;
      }

      private function getItem(id:int = -1):Object {
        // creates a new item
        var o:Object = {};
        o.DESCRIPTION = getText(100);
        o.NAME = getText(12);
        o.PRICE = (Math.random() * 1000);
        o.QTY_IN_STOCK = Math.floor(Math.random() * 500);
        if (id > -1) {
          o.CATEGORY = "NEW123";
          o.PRODUCT_ID = id;
        }
        return o;
      }

      private function getText(chars:uint):String {
        // returns a random string of a given length
        var str:String = "";
        var i:int;
        for (i = 0; i < chars; i++) {
          str += String.fromCharCode(65 + Math.ceil(Math.random() * 25));
        }
        return str;
      }

      private function result(data:Object, token:String):void {
        // process an incoming result. The next action depends on the token string
        var item:Object;
        results++;
        if (isRunning) {
          if (token == "fill" || token == "delete") {
            if (items.length > 0) {
              item = items.getItemAt(Math.ceil(Math.random() * items.length - 1));
              updateItem(item);
              addResponder(ds.commit(), "update");
            }
          } else if (token == "update") {
            items.addItem(createItem());
            addResponder(ds.commit(), "create");
          } else if (token == "create") {
            if (items.length > 0) {
              item = items.getItemAt(Math.ceil(Math.random() * items.length - 1));
              ds.deleteItem(item);
              addResponder(ds.commit(), "delete");
            }
          }
        } else {
          release();
        }
      }

      private function fault(data:Object, token:String):void {
        faults++;
      }

    ]]>
  </mx:Script>

  <mx:DataService id="ds" destination="jdbc-test" autoCommit="false"/>

  <mx:ArrayCollection id="items"/>

  <mx:HBox>
    <mx:Button label="Start" click="start()"/>
    <mx:Button label="Stop" click="stop()"/>
    <mx:Label text="Results: {results}"/>
    <mx:Label text="Faults: {faults}"/>
    <mx:Label id="metrics"/>
  </mx:HBox>

  <mx:DataGrid id="dg" width="100%" height="100%" dataProvider="{items}"/>

</mx:Application>

6.2. MXML-Test File: Data Service Consumer Application

The data consuming application is based on the Flex Stress Testing Framework. The test application fills an ArrayCollection by using a DataService instance. For maximum performance, this application does not use any user interface.

Every message received from the FDS destination increases a counter (depending on the type of message, either result or fault).

<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" creationComplete="run()">

  <mx:Script>
    <![CDATA[

      import mx.data.messages.DataMessage;
      import mx.messaging.events.MessageEvent;
      import mx.messaging.messages.ErrorMessage;

      private var count:int = 0;
      private var failureCount:int = 0;
      private var p:Participant;

      public function run():void {
        var id:String = Application.application.parameters.id;
        var monitored:Boolean = (id != null);
        var name:String = id;

        p = new Participant(name, monitored);

        Participant.eventdispatcher.addEventListener("startRequest", startTest);
        Participant.eventdispatcher.addEventListener("timeUp", stopTest);
      }

      private function startTest(e:Object):void {
        p.startTest(startDataServiceTest, e.duration);
      }

      private function startDataServiceTest():void {
        items = new ArrayCollection();
        ds.fill(items);
      }

      private function stopTest(e:Object):void {
        if (p.testCount == 0) {
          ds.releaseCollection(items, true);
          p.testStopAt = getTimer();
          p.testCount = count;
          p.failureCount = failureCount;
        }
      }

      private function message(event:MessageEvent):void {
        if (event.message is ErrorMessage) {
          failureCount++;
        } else if (event.message is DataMessage) {
          count++;
        }
      }

    ]]>
  </mx:Script>

  <mx:DataService
    id="ds"
    destination="jdbc-test"
    autoCommit="false"
    message="message(event)"
  />

  <mx:ArrayCollection id="items"/>

</mx:Application>


6.3. Server Side Business Logic

For this test we used the FDS JDBCAssembler by Christophe Coenraets [3]. By using the JDBCAssembler you can easily connect FDS to JDBC data sources and perform simple CRUD operations. The JDBCAssembler connects to an HSQLDB database on the same server. The PRODUCT table contains 100 record sets. The table structure is as follows:

Table PRODUCT:

Column name     Type
PRODUCT_ID      INTEGER
NAME            VARCHAR(40)
CATEGORY        VARCHAR(40)
IMAGE           VARCHAR(40)
PRICE           DOUBLE
QTY_IN_STOCK    INTEGER
DESCRIPTION     VARCHAR(255)
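For reference, the table above corresponds to DDL along these lines (sketched here with Python's sqlite3 as a lightweight stand-in for HSQLDB; the column names and types come from the table above, the sample row is invented):

```python
import sqlite3

# In-memory stand-in for the HSQLDB PRODUCT table used by the JDBCAssembler.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE PRODUCT (
        PRODUCT_ID   INTEGER PRIMARY KEY,
        NAME         VARCHAR(40),
        CATEGORY     VARCHAR(40),
        IMAGE        VARCHAR(40),
        PRICE        DOUBLE,
        QTY_IN_STOCK INTEGER,
        DESCRIPTION  VARCHAR(255)
    )
""")
# PRODUCT_ID is assigned automatically, matching the assembler's
# autoincrement setting in data-management-config.xml
conn.execute(
    "INSERT INTO PRODUCT (NAME, CATEGORY, PRICE, QTY_IN_STOCK, DESCRIPTION) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Sample", "NEW123", 9.99, 10, "demo row"),
)
row = conn.execute("SELECT NAME, QTY_IN_STOCK FROM PRODUCT").fetchone()
print(row)  # ('Sample', 10)
```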

6.4. XML Configuration Files

6.4.1. services-config.xml

<channel-definition id="rtmp-jdbcassembler"
    class="mx.messaging.channels.RTMPChannel">
  <endpoint uri="rtmp://fds:2037"
      class="flex.messaging.endpoints.RTMPEndpoint"/>
  <properties>
    <idle-timeout-minutes>20</idle-timeout-minutes>
    <client-to-server-maxbps>100K</client-to-server-maxbps>
    <server-to-client-maxbps>100K</server-to-client-maxbps>
  </properties>
</channel-definition>

[3] http://coenraets.org/blog/2006/11/building-database-driven-flex-applications-without-writing-client-or-server-side-code/


6.4.2. data-management-config.xml

<destination id="jdbc-test" channels="rtmp-jdbcassembler" adapter="java-dao">
  <properties>
    <source>flex.samples.assemblers.SimpleJDBCAssembler</source>
    <scope>application</scope>
    <metadata>
      <identity property="PRODUCT_ID"/>
    </metadata>
    <database>
      <driver>org.hsqldb.jdbcDriver</driver>
      <url>jdbc:hsqldb:/Tomcat5.5/flex/jdbcassembler/db/flexdemo</url>
      <table>product</table>
      <autoincrement>true</autoincrement>
    </database>
  </properties>
</destination>


6.5. Results

Several test runs were made with an increasing number of browser instances. Run time per test was 60 seconds. We measured the total number of successfully received messages (i.e. FDS notifications about updated records) and used the Windows Task Manager to get an idea of the CPU load. The row with the best message throughput is marked with an asterisk (*).

Data Management Test with JDBC-Assembler and HSQLDB

# Browser-Servers   # Browser instances   Total # Browsers   Total # messages   Messages/sec.   % CPU load
4                   1                     4                  1190               19.83           01-02
4                   5                     20                 5970               99.50           04-10
4                   10                    40                 7223               120.38          05-15
4                   15                    60                 13620              227.00          05-15
4                   20                    80                 14619              243.65 *        05-15

7. Conclusion

By using the Flex Stress Testing Framework we were able to get a good idea about general FDS performance. Of course, not every test we did necessarily resembles a real-world situation. Especially when using the Data Management Service, you typically use nested relations to map database relations. You may also want to use paging and lazy loading, which adds additional overhead to FDS – we have not tested this. Depending on your requirements, your mileage will vary.

Our focus was to measure the maximum number of messages FDS is able to handle per second. As you can see from the result tables there is no simple rule of thumb, but the overall performance seems pretty good.