March 11, 2010
Soonwook Hwang
KISTI, Korea
Grid Activities in KISTI
Outline
• Grid Operation and Infrastructure
  • KISTI ALICE Tier2 Center
  • FKPPL VO: Production Grid Infrastructure
• Grid Developments in collaboration with EGEE
  • AMGA
  • Ganga
  • WISDOM
Evolution of KISTI-ALICE Collaboration
• September 2006: Dr. Federico Carminati visited KISTI to discuss the construction of an ALICE Tier2 Center at KISTI
• February 2007: Approved as an ALICE-LCG site
• October 2007: WLCG MoU signed between KICOS and CERN; KISTI became an official Tier2 site for the ALICE experiment
• July 2008: International Summer School on Grid Computing and e-Science, including a tutorial on ALICE computing (AliEn, ROOT, PROOF)
• October 2009: KISTI developers visited the CERN ALICE Computing Group to extend our collaboration to development work on the ALICE computing framework
  • Since then, 1 FTE from KISTI has been allocated to the joint development of PROOF (Parallel ROOT Facility)
Major Changes to KISTI ALICE T2 in 2009
• Major changes in operation:
  • CREAM-CE was introduced in early 2009
  • DPM was replaced with xrootd in September 2009
  • Both the VOBOX and the WNs were migrated to SL5 64-bit in October 2009
• The site was not in good shape in 2009
  • There were problems with the CREAM-CE and xrootd configurations
KISTI's Contribution to ALICE Computing
• Currently, 128 CPU cores and 30 TB of storage are dedicated to the ALICE distributed computing Grid
• About a 1.2% contribution to ALICE computing in total job execution
• Processing nearly 8,000 jobs per month on average
FKPPL VO: Production Grid Infrastructure
FKPPL VO Grid Built on gLite
• Background: collaborative work between KISTI in Korea and CC-IN2P3 in France in the area of Grid computing
• Objective: foster the adoption of Grid technology and provide researchers in Korea and France with a production Grid infrastructure
• Operation: up and running since October 2008, providing about 7,000 CPU cores and 2 TB of disk storage
• As of now, 50 users have joined the FKPPL VO
[Diagram: FKPPL VO layout — central VOMS, WMS, LFC, WIKI, and UI services, with a CE and an SE at each of KISTI and IN2P3]
Application Porting Support on FKPPL VO
• Deployment of Geant4 applications
  • Used extensively by the National Cancer Center in Korea to carry out compute-intensive simulations relevant to cancer treatment planning
  • In collaboration with Dr. Jungwook Shin and Dr. Se Byeong Lee of the National Cancer Center in Korea
• Deployment of two-color QCD (Quantum ChromoDynamics) simulations in theoretical physics
  • Several hundred to several thousand QCD jobs need to be run on the Grid, with each job taking about 10 days
  • In collaboration with Prof. Seyong Kim of Sejong University
Training Activities on Top of FKPPL VO
• In February 2010, we organized a Geant4 and Grid tutorial for the Korean medical physics community
  • About 30 participants from major hospitals in Korea
  • About 20 Grid certificates were issued, and their holders joined the FKPPL VO
Application-Level Checkpointing/Restarting Scheme Developed for the QCD Simulation on the Grid
[Diagram: a QCD job (su2.x, executable plus small inputs) is submitted — and resubmitted when needed — through the WMS to a Computing Element at IN2P3; the job reads its input file(s) from and writes its output file(s) to a Storage Element; a Heartbeat Monitor retrieves job status and small output files; intermediate results are sent to a Checkpoint Server, from which a restarted job retrieves the latest intermediate result.]
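The restart logic can be sketched in Python. This is a minimal local illustration of the idea, not the production code: the real scheme ships intermediate results to a remote checkpoint server, whereas here the checkpoint is a hypothetical local JSON file, and one "iteration" stands in for one QCD update sweep.

```python
import json
import os

CHECKPOINT = "checkpoint.json"  # hypothetical local stand-in for the remote checkpoint server

def load_checkpoint():
    """Resume from the latest intermediate result if one exists, else start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"iteration": 0, "state": 0.0}

def save_checkpoint(ckpt):
    """Persist the intermediate result so a resubmitted job can restart here."""
    with open(CHECKPOINT, "w") as f:
        json.dump(ckpt, f)

def run(total_iterations=100, checkpoint_every=10):
    ckpt = load_checkpoint()
    for i in range(ckpt["iteration"], total_iterations):
        ckpt["state"] += 1.0        # stand-in for one simulation update step
        ckpt["iteration"] = i + 1
        if ckpt["iteration"] % checkpoint_every == 0:
            save_checkpoint(ckpt)   # in the Grid scheme, sent to the checkpoint server
    save_checkpoint(ckpt)
    return ckpt

result = run()
```

If the job is killed after, say, iteration 70 and resubmitted, `load_checkpoint` picks up at iteration 70 instead of restarting a 10-day run from scratch — which is the point of the scheme.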
Grid Development Activities in Collaboration with EGEE
AMGA
• An official EGEE gLite middleware component for a metadata catalogue service on the Grid
• AMGA provides:
  • Access to metadata for files distributed on the Grid
  • Simplified, general access to relational data stored in database systems
• KISTI has led AMGA development since July 2009
  • AMGA 2.0, supporting the OGF WS-DAIR specification, was successfully released in October 2009 in collaboration with CERN and INFN
• Currently, KISTI is one of the partners of the open gLite collaboration and is working with EMI, contributing to the evolution and maintenance of AMGA
[Application domains: High Energy Physics, Digital Library, Climate Research, Drug Discovery]
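The metadata-catalogue idea — directory-organised entries carrying typed, queryable attributes — can be illustrated with an in-memory relational store. This sketch uses sqlite3 with made-up paths, attribute names, and values purely for illustration; it is not AMGA's actual interface or schema.

```python
import sqlite3

# Miniature metadata catalogue: entries live under directory-like paths
# and carry typed attributes that can be queried with relational conditions.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE entries (directory TEXT, name TEXT, run INTEGER, energy REAL)")

# Register files together with their metadata (hypothetical example data)
rows = [
    ("/belle/raw", "file001.root", 42, 7.8),
    ("/belle/raw", "file002.root", 43, 8.0),
    ("/belle/mc",  "sim001.root",  42, 7.8),
]
db.executemany("INSERT INTO entries VALUES (?, ?, ?, ?)", rows)

# Attribute query: find raw files belonging to run 42
hits = db.execute(
    "SELECT name FROM entries WHERE directory = '/belle/raw' AND run = 42"
).fetchall()
print(hits)  # [('file001.root',)]
```

The value of a service like AMGA is that exactly this kind of attribute query works against files scattered across Grid storage, without the user talking to the underlying databases directly.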
The Use of AMGA for the Belle-II Metadata Service
Belle-II Metadata Service Scenario
[Diagram: hierarchical Belle-II metadata service — at Tier 0 (KEK), an LV2 AMGA server holds the full metadata set (R, s0–s9), which is partitioned and replicated across four LV1 AMGA servers; at Tier 1 (KISTI), local LV2 AMGA servers hold subsets of the metadata, and a Personal AMGA Web Service supports metadata download and publishing. Arrows denote metadata access, metadata replication, and metadata redirection.]
Ganga
• An easy-to-use user interface for job submission and management
  • Specification, submission, bookkeeping, and post-processing of computational tasks on a wide set of distributed resources
• Provides a homogeneous environment for processing data on heterogeneous resources
[Diagram: Ganga sits between applications (Athena, GAUDI, ROOT, generic executables) and interchangeable backends — LCG/gLite, PBS or SGE, and Local — each driven through its command-line or library interface, covering Grid, batch, and localhost resources.]
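Ganga achieves this homogeneity through pluggable backends: the same job description is handed to whichever backend matches the target resource. The pattern can be sketched in plain Python — this is an illustration of the design only, not Ganga's actual API, and all class and method names here are made up:

```python
class Backend:
    """Common interface every resource type implements."""
    def submit(self, job):
        raise NotImplementedError

class LocalBackend(Backend):
    def submit(self, job):
        # A real backend would fork a process on the local host
        return f"local: ran {job.executable}"

class BatchBackend(Backend):
    def submit(self, job):
        # A real backend would call the PBS/SGE command-line tools
        return f"batch: queued {job.executable}"

class Job:
    """One job description, runnable on any backend unchanged."""
    def __init__(self, executable, backend):
        self.executable = executable
        self.backend = backend

    def submit(self):
        return self.backend.submit(self)

# The same job, switched between resources by swapping the backend:
print(Job("su2.x", LocalBackend()).submit())
print(Job("su2.x", BatchBackend()).submit())
```

Because only the backend object changes, moving a workload from a laptop to a batch farm to the Grid requires no change to the job specification itself — which is what makes backends like the GangaKISTI modules below easy to add.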
The Development of GangaKISTI Modules in Collaboration with the CERN Ganga Team
• GangaKISTI modules were developed and integrated into the official release of Ganga 5.3.2 in July 2009:
  • GridWay backend
  • InterGrid backend
[Diagram: with the GangaKISTI modules, Ganga adds a GridWay backend and an InterGrid backend alongside the existing LCG/gLite, PBS or SGE, and Local backends, allowing applications such as Athena, GAUDI, ROOT, and AutoDock to run across BioMed, FKPPL, PRAGMA, and local PBS/SGE resources.]
The WISDOM Project
• An international initiative to deploy large-scale in-silico docking on a public Grid infrastructure
• An attempt to find potential drugs against neglected or emerging diseases, e.g. malaria and avian flu
• KISTI has been involved in the world-wide data challenge of the WISDOM project
• Grid infrastructure used: EGEE Biomed VO
• Number of ligands docked: 308,307
• Average number of CPUs used: 3,000
• Execution time: 3.2 days
• CPU time consumed: 22.4 CPU-years
• Data size produced: 16.3 GB
Data Challenge against Diabetes Type II in 2009
• Conducted a data challenge against maltase, a target protein of type 2 diabetes, with 308,307 chemical compounds
Concluding Remarks
• KISTI has participated in the EGEE project since 2006
• As the EGEE project comes to an end, the last four years of participation have given us a great deal of experience in many areas:
  • Production Grid operation: the ALICE Tier2 center
  • Application porting on the Grid: Geant4 and QCD simulations
  • High-level Grid tools: Ganga
  • Grid core middleware services: AMGA
  • And even a production Grid infrastructure of our own: the FKPPL VO
Thank you for your attention!