ERPM DOC
7/31/2019 ERPM DOC
1/96
A Project Report On
ENTERPRISE RESOURCE PROJECT MANAGEMENT
Submitted in partial fulfillment of the requirements for the award of the degree of
BACHELOR OF TECHNOLOGY
By
Under the Guidance of
INDEX:
Chapter 1. INTRODUCTION
    Organization Profile
    ABSTRACT
Chapter 2. SYSTEM ANALYSIS
    Requirement Analysis
    Software Requirement Specification
    Feasibility Study
        Economic Feasibility
        Technical Feasibility
        Behavioral Feasibility
        Operational Feasibility
    Existing System
    Problem Statement
    Proposed System
    Hardware & Software Specification
Chapter 3. SYSTEM DESIGN
    3.1 Introduction
    Project Modules
    Data Dictionary
    Data Flow Diagrams
    UML Diagrams
    Sequence Diagrams
    LIFE CYCLE MODELS
Chapter 4. SYSTEM ENVIRONMENT
Chapter 5. SYSTEM TESTING
Chapter 6. SCREENS
Chapter 7. CODING
Chapter 8. CONCLUSION
    8.1 BIBLIOGRAPHY
ABSTRACT:
Enterprise Resource Planning Management is an online system in which personnel and general administration activities are fully automated, such as Recruitment, Employee establishment and personal information, Medical Reimbursement, Leave and Attendance, Payroll, Training, etc. The existing RECRUITMENT SYSTEM in the Nagarjuna Group is currently implemented in FoxPro.
EXISTING SYSTEM:
No proper dynamic search method is available to immediately access a particular record. Fast retrieval of required data is not possible, causing delays and unnecessary searches through the entire list.
FoxPro under the Novell NetWare version is not a graphical-user-interface-based application. User interaction with the system is minimized because of the DOS environment, unlike the Windows environment, where user interaction with the system is high.
Handling large databases effectively is not possible with the above software.
Creating dynamic queries is difficult in FoxPro, so dynamic report generation is not possible.
A security feature, which is a very important aspect for NFCL, already exists but needs to be enhanced and made foolproof.
Online reports and graphical representation of reports do not exist.
PROPOSED SYSTEM:
Keeping in view the growth that has been envisaged, it may not be practical and economical to continue with the current system. To facilitate a more efficient Recruitment System and to increase responsiveness, it is necessary to have a better Recruitment System integrated with the enterprise's Information System.
SYSTEM OBJECTIVES:
To automate the selection process. To provide a highly graphical user interface to the user. To provide better functioning and accurate information in time. To provide data maintenance features. To improve efficiency and reduce the overload of work. To generate appropriate and relevant information for the user using dynamic queries. To generate appropriate reports. To provide security.
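One of the objectives above is generating information with dynamic queries, which the FoxPro system could not support. The project itself is written in C#.NET against SQL Server; purely as a language-neutral sketch, the following hypothetical example (table and column names invented, not the project's real schema) shows how a search query can be assembled at run time from whichever filters the user supplies, using Python's built-in sqlite3:

```python
import sqlite3

def dynamic_search(conn, filters):
    """Build a parameterized query from only the filters the user supplied.

    `filters` is a dict such as {"city": "Hyderabad"}; the column names
    here are illustrative only.
    """
    allowed = {"name", "city", "qualcode"}   # whitelist columns to avoid SQL injection
    clauses, params = [], []
    for col, val in filters.items():
        if col in allowed:
            clauses.append(f"{col} = ?")     # values always go in as parameters
            params.append(val)
    sql = "SELECT name, city FROM applicants"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return conn.execute(sql, params).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE applicants (name TEXT, city TEXT, qualcode TEXT)")
conn.executemany("INSERT INTO applicants VALUES (?, ?, ?)",
                 [("Ravi", "Hyderabad", "BTECH"), ("Sita", "Chennai", "MCA")])
print(dynamic_search(conn, {"city": "Hyderabad"}))
```

Because the WHERE clause is built from whatever filters arrive, the same function serves many ad hoc searches, which is the "dynamic query" capability the objectives call for.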
SOFTWARE METHODOLOGY:
The software methodology followed in this project includes the object-oriented
methodology and the application system development methodologies. The description of
these methodologies is given below.
APPLICATION SYSTEM DEVELOPMENT - A LIFE CYCLE APPROACH:
Although there are a growing number of applications (such as decision support systems) that should be developed using an experimental process strategy such as prototyping, a significant amount of new development work continues to involve major operational applications of broad scope. These application systems are large and highly structured. User task comprehension and developer task proficiency are usually high. These factors suggest a linear or iterative assurance strategy. The most common method for this class of problems is a system development life cycle model, in which each stage of development is well defined and has straightforward requirements for deliverables, feedback and sign-off. The system development life cycle is described in detail here, since it continues to be an appropriate methodology for a significant part of new development work.
The basic idea of the system development life cycle is that there is a well-defined process by which an application is conceived, developed and implemented. The life cycle gives structure to a creative process. In order to manage and control the development effort, it is necessary to know what should have been done, what has been done, and what has yet to be accomplished. The phases in the system development life cycle provide a basis for management and control because they define segments of the flow of work that can be identified for managerial purposes, and they specify the documents or other deliverables to be produced in each phase.
The phases in the life cycle for information system development are described differently by different writers, but the differences are primarily in the degree of detail and the manner of categorization. There is general agreement on the flow of development steps and the necessity for control procedures at each stage.
The information system development cycle for an application consists of three major stages:
1) Definition.
2) Development.
3) Installation and operation.
The first stage of the process defines the information requirements for a feasible, cost-effective system. The requirements are then translated into a physical system of forms, procedures, programs, etc., by system design, computer programming and procedure development. The resulting system is tested and put into operation. No system is perfect, so there is always a need for maintenance changes. To complete the cycle, there should be a post audit of the system to evaluate how well it performs and how well it meets the cost and performance specifications. The stages of definition, development, and installation and operation can therefore be divided into smaller steps or phases as follows:
DEFINITION:
Proposal definition: preparation of a request for a proposed application.
Feasibility assessment: evaluation of the feasibility and cost-benefit of the proposed system.
Information requirement analysis: determination of the information needed.
DESIGN:
Conceptual design: user-oriented design of the application.
Physical system design: detailed design of the flows and processes in the application processing system, and preparation of program specifications.
DEVELOPMENT:
Program development: coding and testing of computer programs.
Procedure development: design of procedures and preparation of user instructions.
INSTALLATION AND OPERATION:
Conversion: final system test and conversion.
Operation and maintenance: month-to-month operation and maintenance.
Post audit: evaluation of the development process, the application system and the results of use.
At the completion of each phase, formal approval sign-off is required from the users as well as from the manager of project development.
SOFTWARE PROCESS MODELS:
To solve actual problems in an industrial setting, a software engineer or a team of engineers must incorporate a development strategy that encompasses the process, methods and tools layers. This strategy is often referred to as a process model or software engineering paradigm.
A process model for software is chosen based on the nature of the project and application, the methods and tools to be used, and the controls and deliverables that are required.
All software development can be characterized as a problem-solving loop in which
four distinct stages are encountered:
1. Status quo
2. Problem definition
3. Technical development
4. Solution integration
Status quo: represents the current state of affairs.
Problem definition: identifies the specific problem to be solved.
Technical development: application of some technology to solve the problem.
Solution integration: delivers the results to those who requested the solution.
The various models are:
1. Linear sequential model.
2. Prototype model.
3. The RAD model.
4. Evolutionary software model.
5. Formal methods model.
6. Fourth generation techniques.
A MODEL OF THE PROTOTYPING PROCESS:
Prototyping an application system is basically a four-step process, as described below. There are two significant roles: the user and the system designer.
Step 1: Identify the user's basic information requirements.
In this step the user articulates his or her basic needs in terms of output from the system. The designer's responsibility is to establish realistic user expectations and to estimate the cost of developing an operational prototype. The data elements are defined and their availability determined.
Step 2: Develop the initial prototype system.
The objective of this step is to build a functional, interactive application system that meets the user's basic stated information requirements. The system designer has the responsibility for building the system using very high-level development tools. The early prototype is delivered to the user to assess its capability and guide further development.
Step 3: Use the prototype system to refine the user's requirements.
This step allows users to gain hands-on experience with the system in order to understand their information needs and what the system does and does not do to meet those needs. The user, rather than the designer, decides when changes are necessary and thus controls the overall development time.
Fourth generation techniques:
The term "fourth generation techniques" encompasses a broad array of tools that have one thing in common: each enables the software engineer to specify some characteristic of software at a high level. The tool then automatically generates source code based on the developer's specification. 4GT focuses on the ability to specify software using specialized language forms or a graphic notation that describes the problem to be solved in terms that the customer can understand.
Current 4GT tools include: nonprocedural languages for database query, report generation, data manipulation, screen interaction and definition, code generation, and high-level graphics capability.
The current state of 4GT approaches:
The use of 4GT has broadened considerably over the past decade and is now a viable approach for many different application areas. 4GT offers solutions to various problems by using computer-aided software engineering tools and code generators.
Data collected from companies using 4GT indicates that the time required to produce software is greatly reduced for small and intermediate applications.
Considerable analysis is performed to obtain the time savings that can be achieved through the elimination of coding.
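As a toy illustration of the 4GT idea of generating source code from a high-level specification (nothing here is part of the project, and the table and column names are invented), a few lines can turn a declarative description into SQL DDL:

```python
def generate_ddl(table, columns):
    """Generate a CREATE TABLE statement from a declarative column spec.

    `columns` maps column name -> SQL type. A real 4GT-style tool would
    generate far more (screens, reports, access code) from the same kind
    of specification; this only shows the generation principle.
    """
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"

# Hypothetical spec resembling the report's LOGIN table.
spec = {"userid": "varchar(15) PRIMARY KEY", "password": "varchar(15)"}
print(generate_ddl("login", spec))
```

The developer writes only the specification; the "coding" of the DDL is eliminated, which is the time saving the paragraph above describes.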
SCHEDULE
Study phase:
The study phase is the phase during which the problem is identified, alternative solutions are studied, and recommendations are made about committing the personnel, money and other resources required to design the system. The activities in this phase include investigation of the problem, determination of the desired system performance, and identification and evaluation of alternatives in order to select the most cost-effective system. A study phase report is prepared, and this system is recommended to the user or users of the system as the most feasible solution to the problem.
The first step in this phase is problem identification. The second step is performance definition, which means determining what the usable outputs of the system may be. The third step in selecting a system is to identify possible systems that might solve the problem and to select one of these. We call the possible solutions alternatives, and we call the process of selecting the most cost-effective alternative the selection of a feasible solution.
In order to perform the above activities we need at least 15 days. The study of the existing system took three days, and the study of the proposed system took another three days. The various alternatives available to prepare the proposed system were examined in two days, and the best of these alternatives was selected for implementation. The data flow diagrams were prepared within a few days, and the final report of the study phase was prepared in three days and submitted.
Design phase:
The detailed design of the system selected in the study phase takes place during the design phase. System design starts by reviewing the study phase activities and making final decisions about which functions are to be performed by hardware, software or humans. In this phase the output, input and database storage designs are completed for each of the computer programs. The design phase recommendations are presented to the user in a report.
To perform these activities, the expected time duration is again fifteen to twenty days. The logical design shows how the system meets the requirements; this may take a minimum of ten days. Here the data stores, data sources, etc. should be identified. The physical design, showing the development of the actual program software, may take another ten days.
Development phase:
In the development phase, the system is constructed to fulfill the requirements outlined in the design phase. Development phase activities include preparing manuals, training employees, and writing and testing computer programs. At the conclusion of the development phase, the system is ready to be put into use.
This phase concludes with a presentation of the complete system for acceptance by the user at a management review meeting. The whole of the development may take one month. Actually writing the code in the chosen language takes at least ten days; preparing manuals, etc. takes five to eight days; finally, testing of the computer programs may take ten more days.
Operation phase:
The operation phase is the period during which the system is used. Activities include changing over to the new system, monitoring the system's performance, and establishing procedures for making modifications or changes to the system. This phase continues for the rest of the system's useful life. The implementation and evaluation may take at least three months.
MODULE DESCRIPTION:
In this project, ERPM, we have addressed all the problems discussed under the existing system above. We provide a solution to management with respect to time and cost, and we provide separate authentication and authorization facilities for each management level.
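The per-level authorization idea can be sketched minimally as a role-to-permissions map. The role and action names below are hypothetical illustrations; the real project implements this in C#.NET:

```python
# Hypothetical role -> permitted-actions map for the three modules.
PERMISSIONS = {
    "registration": {"store_applicant", "report_applicants", "report_skillsets"},
    "junior_mgmt":  {"conduct_test", "report_results", "assign_interviewers"},
    "senior_mgmt":  {"view_applicants", "shortlist", "send_appointment_letters"},
}

def authorize(role, action):
    """Return True only if the role's module permits the requested action."""
    return action in PERMISSIONS.get(role, set())

print(authorize("junior_mgmt", "conduct_test"))              # expected: True
print(authorize("junior_mgmt", "send_appointment_letters"))  # expected: False
```

Keeping the mapping in one table means adding a management level or an action is a data change, not a code change.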
This project is divided into three modules:
1) REGISTRATION
2) JUNIOR LEVEL MANAGEMENT
3) SENIOR LEVEL MANAGEMENT
REGISTRATION MODULE:
The organization receives applicant details from different sources, such as by post, by email, and by hand. The registration module deals with the procedure to store these applicant details in the database. In this module we can generate two reports: one for applicant details and a second for skill-set details. Senior level management uses these reports.
JUNIOR LEVEL MANAGEMENT:
In this module, junior level management conducts the different tests for the applicants; after the technical written test, these include psychological tests, organizational awareness tests, and so on.
In this module we can generate reports related to the test results, and junior level management allows successful applicants through to the next level. Finally, they assign the applicants to the interviewers for interview.
SENIOR LEVEL MANAGEMENT:
In this module, senior level management interacts with the database for applicant details. They go through the applicant details and their skill-set reports, select the applicants for the technical written test, and send intimation letters to those applicants.
This management level also deals with the applicants after the junior level management selection process. After the interviews, they get the details of the candidates selected in the interview; with these details they prepare the final candidate list for appointment and send appointment letters to the selected applicants.
DATA DICTIONARY
The data dictionary records the logical characteristics of the current system's data stores, including name, description, aliases, contents, and organization; identifies the processes where the data are used and where immediate access to information is required; and serves as the basis for identifying database requirements during system design.
Uses of the Data Dictionary:
1. To manage the details in large systems.
2. To communicate a common meaning for all system elements.
3. To document the features of the system.
4. To facilitate analysis of the details in order to evaluate characteristics and determine where system changes should be made.
LOGIN
  userid            varchar2(15)   primary key
  password          varchar2(15)

PERSONAL_INFO2
  userid            varchar2(15)   references login(userid)
  name              varchar2(25)   not null
  fathername        varchar2(25)   not null
  otherdependent    varchar2(25)
  native            varchar2(15)
  healthhistory     varchar2(15)
  religion          varchar2(15)
  lactivity         varchar2(20)
  caste             varchar2(10)   not null
  preferred         varchar2(10)   not null
  height            number(5,2)
  weight            number(5,2)
  vision            varchar2(10)
  maritalstatus     varchar2(10)

EDUCATION1
  userid            varchar(15)    references login(userid)
  name              varchar2(20)
  qualcode          varchar2(20)   not null
  passedyear        date           not null
  division          varchar2(30)
  insadd            varchar2(30)
  city              varchar2(15)
  percent           number(10)     not null
  subject           varchar2(10)
  awards            varchar2(20)

ADDRESS_INFO
  userid            varchar2(15)   references login(userid)
  sname             varchar2(25)
  permanent_address varchar2(50)
  phonenumber       number(20)
  fax               number(15)
  emailid           varchar2(20)   not null
  phonenumber2      number(20)
  permanentcity     varchar2(20)
  presentaddress    varchar2(50)
  presentcity       varchar2(15)

MARKS1
  userid            varchar2(15)   references login(userid)
  name              varchar2(20)
  marks             number(10)     not null

JUNIOR1
  userid            varchar2(20)   references login(userid)
  intdate           date
  sname             varchar2(20)
  pcode             varchar2(20)
  ivname            varchar(20)
  motivationrem     varchar2(10)
  planningrem       varchar2(10)
  teamrem           varchar2(10)
  objrem            varchar2(10)
  jknowrem          varchar2(10)
  comrem            varchar2(10)
  probsolvrem       varchar2(10)
  asserrem          varchar2(10)
  selfdevrem        varchar2(10)
  result            varchar2(10)

EXPERIENCE1
  userid              varchar2(15)   references login(userid)
  sname               varchar2(20)
  startdate           date
  enddate             date
  organisationaddress varchar2(50)
  designation         varchar2(20)
  city                varchar2(20)
  remuneration        number(10)
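To make the dictionary concrete, here is a minimal runnable sketch of the LOGIN and MARKS1 entries above. The Oracle-style types (varchar2, number) are transliterated to SQLite types purely so the sketch can execute; the actual project targets SQL Server.

```python
import sqlite3

# Runnable sketch of two data dictionary entries: every other table
# references login(userid), which is what the foreign keys below enforce.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite disables FK checks by default
conn.execute("""CREATE TABLE login (
    userid   TEXT PRIMARY KEY,   -- varchar2(15) in the dictionary
    password TEXT NOT NULL       -- varchar2(15)
)""")
conn.execute("""CREATE TABLE marks1 (
    userid TEXT REFERENCES login(userid),
    name   TEXT,
    marks  INTEGER NOT NULL      -- number(10) in the dictionary
)""")
conn.execute("INSERT INTO login VALUES ('app001', 'secret')")
conn.execute("INSERT INTO marks1 VALUES ('app001', 'Ravi', 78)")
```

With the foreign key on, attempting to store marks for a userid that was never registered in LOGIN raises an integrity error, which is exactly the dependency the dictionary expresses.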
HIERARCHY CHART:
[Chart: application assessment, written test assessment and interview assessment feed reports to low-level and high-level management.]
[Flowchart: a requisition for manpower is checked against the approved organization structure; if not in line, a reply is sent to the indenting department. Sourcing is done through consultants, responses to advertisements, proactive and referred candidates, and the data bank. After manual screening and an initial interview, rejected applications are set aside and the remaining applications proceed to assessment (continued at B).]
DATA FLOW DIAGRAMS (DFD)
[Flowchart, continued from B: short-listed candidates go through the final interview, salary fixation, a reference check and approval from the Group Chairman; rejected candidates are intimated (A). After the medical test, either the candidate is rejected or an appointment order is issued, the induction department is intimated, joining formalities are completed, and the candidate is added to the employee database.]
-
7/31/2019 ERPM DOC
23/96
The data flow diagram is used for classifying system requirements into major transformations that will become programs in system design. It is the starting point of the design phase, which functionally decomposes the required specifications down to the lowest level of detail. It consists of a series of bubbles joined together by lines.
Bubbles: represent the data transformations.
Lines: represent the logical flow of data.
Data can trigger events and can be processed into useful information. System analysis recognizes the central role of data in organizations. Dataflow analysis tells a great deal about how organization objectives are accomplished.
Dataflow analysis studies the use of data in each activity and documents these findings in DFDs. Dataflow analysis gives the activities of a system from the viewpoint of data: where it originates, how it is used or changed, and where it goes, including the stops along the way to its destination. The components of the dataflow strategy span both requirements determination and systems design; the first part is called dataflow analysis.
As the name suggests, we did not use the dataflow analysis tools exclusively for the analysis stage, but also in the designing phase, with documentation.
NOTATIONS USED IN DATA FLOW DIAGRAMS
The logical dataflow diagrams can be drawn using only four simple notations, i.e., special symbols or icons and the annotation that associates them with a specific system. Since the choice of notation does not affect, impede or catalyze the system process, we used three symbols from the Yourdon notation and one from the Gane and Sarson notation, as specified below.
The four elements are: Process, Data Store, Data Flow, and Source or Sink.
Description:
Process: describes how input data is converted to output data.
Data Store: describes the repositories of data in a system.
Data Flow: describes the data flowing between processes, data stores and external entities.
Source: an external entity causing the origin of data.
Sink: an external entity which consumes the data.
Context Diagram:
The top-level diagram is often called a context diagram. It
contains a single process, but it plays a very important role in studying the
current system. The context diagram defines the system that will be
studied in the sense that it determines the boundaries. Anything that is
not inside the process identified in the context diagram will not be part of
the system study. It represents the entire software element as a single
bubble with input and output data indicated by incoming and outgoing
arrows respectively.
TYPES OF DATAFLOW DIAGRAMS:
DFDs are of two types:
1. PHYSICAL DFD
Structured analysis states that the current system should first be understood correctly. The physical DFD is the model of the current system and is used to ensure that the current system has been clearly understood. Physical DFDs show the actual devices, departments, people, etc. involved in the current system.
2. LOGICAL DFD
Logical DFDs are the model of the proposed system. They should clearly show the requirements on which the new system is to be built. Later, during the design activity, this is taken as the basis for drawing the system's structure charts.
CONTEXT DFD:
[Diagram: the applicant submits the application form to the recruitment system and receives interview details; the system records written test details, written test marks and the interview assessment.]
FIRST LEVEL DFD:
[Diagram: the applicant's application form is entered and processed, capturing personal, address, education, reference, experience, languages-known, training and professional-body details; the written test process records written test details and marks; the interview assessment process leads to the final selection.]
2ND LEVEL DFD:
[Diagram: entered details (personal, education, languages known, professional bodies, relative details, written test details, interview details) are processed through the application form, and reports are generated: call letters for the written test and for interviews, mailing labels, the selection list and the interview assessment.]
[Diagram, continued: the processed details also produce intimation letters, mailing labels, an overall performance report and the selection list.]
ENTITY-RELATIONSHIP DIAGRAMS
An E-R (Entity-Relationship) diagram is used to represent the relationships between the entities in the tables.
THE SYMBOLS USED IN E-R DIAGRAMS ARE:
Rectangle: represents entity sets.
Ellipse: represents attributes.
Diamond: represents relationship sets.
Line: represents flow.
[E-R diagram: ADDRESS_DETAILS (sno, sname, permanent_addr, present_addr, phone, email, city) is related to EXPERIENCE (sno, sname, startdate, enddate, designation, city).]
[E-R diagram: PERSONAL_INFO (sno, name, fathername, native, caste, marital status, height, weight, vision) is related to EDUCATION (sno, name, qualcode, passedyear, division, instadd, city).]
[E-R diagram: PERSONAL_INFO (sno, name, fathername, native, religion, caste, height, weight) is related through RESULTS to MARKS (sno, imarks).]
[E-R diagram: JUNIOR (sno, sname, position, plan, interviewer, achievement, job knowledge) is related through RESULTS to MARKS (sno, phone, imarks).]
[E-R diagram: PERSONAL_INFO (sno, name, fathername, native, religion, caste, height, weight) is related to ADDRESS_DETAILS (sno, sname, permanent address, phone, imarks).]
SOFTWARE AND HARDWARE REQUIREMENTS
HARDWARE SPECIFICATION:
Processor          : Intel P-IV based system
Processor Speed    : 250 MHz to 833 MHz
RAM                : 256 MB to 512 MB
Hard Disk          : 2 GB to 30 GB

SOFTWARE SPECIFICATION:
Language               : C#.NET
Database               : SQL Server
Operating System       : Windows 2000
Technologies           : ASP.NET, ADO.NET
Web/Application server : Internet Information Services (IIS)
Microsoft .NET Framework
The .NET Framework is a new computing platform that simplifies application development in the highly distributed environment of the Internet. The .NET Framework is designed to fulfill the following objectives:
To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.
To provide a code-execution environment that minimizes software deployment and versioning conflicts.
To provide a code-execution environment that guarantees safe execution of code, including code created by an unknown or semi-trusted party.
To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments.
To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and Web-based applications.
To build all communications on industry standards to ensure that code based on the .NET Framework can integrate with any other code.
The .NET Framework has two main components:
o The common language runtime.
o The .NET Framework class library.
The common language runtime is the foundation of the .NET Framework. You can think of the runtime as an agent that manages code at execution time, providing core services such as memory management, thread management, and remoting, while also enforcing strict type safety and other forms of code accuracy that ensure security and robustness. In fact, the concept of code management is a fundamental principle of the runtime. Code that targets the runtime is known as managed code, while code that does not target the runtime is known as unmanaged code. The class library, the other main component of the .NET Framework, is a comprehensive, object-oriented collection of reusable types that you can use to develop applications ranging from traditional command-line or graphical user interface (GUI) applications to applications based on the latest innovations, such as Web Forms and XML Web services.
The .NET Framework can be hosted by unmanaged components that load the common language runtime into their processes and initiate the execution of managed code, thereby creating a software environment that can exploit both managed and unmanaged features. The .NET Framework not only provides several runtime hosts, but also supports the development of third-party runtime hosts.
For example, ASP.NET hosts the runtime to provide a scalable, server-side environment for managed code. ASP.NET works directly with the runtime to enable Web Forms applications and XML Web services, both of which are discussed later in this topic.
Internet Explorer is an example of an unmanaged application that hosts the runtime. Using Internet Explorer to host the runtime enables you to embed managed components or Windows Forms controls in HTML documents. Hosting the runtime in this way makes managed mobile code possible, but with significant improvements that only managed code can offer, such as semi-trusted execution and secure isolated file storage.
The following illustration shows the relationship of the common language
runtime and the class library to your application and to the overall system. The
illustration also shows how managed code operates within a larger architecture.
Features of the Common Language Runtime:
The common language runtime manages memory, thread execution, code execution, code safety verification, compilation, and other system services. These features are intrinsic to the managed code that runs on the common language runtime. Language compilers that target the .NET Framework make the features of the .NET Framework available to existing code written in that language, greatly easing the migration process for existing applications.
While the runtime is designed for the software of the future, it also supports the software of today and yesterday. Interoperability between managed and unmanaged code enables developers to use necessary COM components and DLLs.
The runtime is designed to enhance performance. Although the common language runtime provides many standard runtime services, managed code is never interpreted. A feature called just-in-time (JIT) compiling enables all managed code to run in the native machine language of the system on which it is executing. Meanwhile, the memory manager removes the possibility of fragmented memory and increases memory locality-of-reference to further improve performance.
Finally, the runtime can be hosted by high-performance, server-side applications such as Microsoft SQL Server and Internet Information Services (IIS). This infrastructure enables you to use managed code to write your business logic, while still enjoying the superior performance of the industry's best enterprise servers that support runtime hosting.
.NET Framework class library:
The .NET Framework class library is a collection of reusable types that integrate with the common language runtime. The class library is object oriented, providing types from which your managed code can derive functionality. This not only makes the .NET Framework types easy to use, but also reduces the time associated with learning new features of the .NET Framework. In addition, third-party components can integrate seamlessly with classes in the .NET Framework.
For example, the .NET Framework collection classes implement a set of interfaces that you can use to develop your own collection classes. Your collection classes will blend seamlessly with the classes in the .NET Framework.
As you would expect from an object-oriented class library, the .NET Framework types enable you to accomplish a range of common programming tasks, including tasks such as string management, data collection, database connectivity, and file access.
In addition to these common tasks, the class library includes types that support a variety of specialized development scenarios. For example, you can use the .NET Framework to develop the following types of applications and services:
1. Console applications
2. Scripted or hosted applications
3. Windows GUI applications (Windows Forms)
4. ASP.NET applications
5. XML Web services
6. Windows services
For example, the Windows Forms classes are a comprehensive set of reusable types that vastly simplify Windows GUI development. If you write an ASP.NET Web Form application, you can use the Web Forms classes.
Client Application Development:
Client applications are the closest to a traditional style of application in
Windows-based programming. These are the types of applications that display windows
or forms on the desktop, enabling a user to perform a task. Client applications include
applications such as word processors and spreadsheets, as well as custom business
applications such as data-entry tools, reporting tools, and so on. Client applications
usually employ windows, menus, buttons, and other GUI elements, and they typically
access local resources such as the file system and peripherals such as printers.
Another kind of client application is the traditional ActiveX control (now
replaced by the managed Windows Forms control) deployed over the Internet as a web
page. This application is much like other client applications: it is executed natively, has
access to local resources, and includes graphical elements.
ACTIVEX DATA OBJECTS
ADO .NET Overview
ADO.NET is an evolution of the ADO data access model that directly addresses user
requirements for developing scalable applications. It was designed specifically for the
web with scalability, statelessness, and XML in mind.
ADO.NET uses some ADO objects, such as the Connection and Command objects, and also introduces new objects. Key new ADO.NET objects
include the DataSet, DataReader, and DataAdapter.
The important distinction between this evolved stage of ADO.NET and
previous data architectures is that there exists an object, the DataSet, that is separate
and distinct from any data stores. Because of that, the DataSet functions as a
standalone entity. You can think of the DataSet as an always disconnected recordset
that knows nothing about the source or destination of the data it contains. Inside a
DataSet, much like in a database, there are tables, columns, relationships, constraints,
views, and so forth.
A Data Adapter is the object that connects to the database to fill the
DataSet. Then, it connects back to the database to update the data there, based on
operations performed while the DataSet held the data.
In the past, data processing has been primarily connection based. Now, in
an effort to make multi-tiered applications more efficient, data processing is turning to a
message-based approach that revolves around chunks of information. At the center of this
approach is the Data Adapter, which provides a bridge for retrieving and saving data
between a DataSet and its source data store.
It accomplishes this by means of requests to the appropriate SQL
commands made against the data store.
The XML-based DataSet object provides a consistent programming model
that works with all models of data storage: flat, relational, and hierarchical. It does this
by having no knowledge of the source of its data: no matter what the source of
the data within the DataSet is, it is manipulated through the same set of standard APIs
exposed through the DataSet and its subordinate objects.
While the DataSet has no knowledge of the source of its data, the
managed provider has detailed and specific information. The role of the managed
provider is to connect, fill, and persist the DataSet to and from data stores. The OLE DB
and SQL Server .NET Data Providers (System.Data.OleDb and System.Data.SqlClient)
that are part of the .NET Framework provide four basic objects: the Command,
Connection, Data Reader and Data Adapter. In the remaining sections of this document,
we'll walk through each part of the DataSet and the OLE DB/SQL Server .NET Data
Providers, explaining what they are and how to program against them.
The following sections will introduce you to some objects that have evolved,
and some that are new. These objects are:
Connections. For connecting to and managing transactions against a database.
Commands. For issuing SQL commands against a database.
Data Readers. For reading a forward-only stream of data records from a SQL
Server data source.
Datasets. For storing, removing and programming against flat data, XML data
and relational data.
Data Adapters. For pushing data into a DataSet, and reconciling data against a
database.
When dealing with connections to a database, there are two different
options: the SQL Server .NET Data Provider (System.Data.SqlClient) and the OLE DB .NET
Data Provider (System.Data.OleDb). In these samples we will use the SQL Server .NET
Data Provider, which is written to talk directly to Microsoft SQL Server. The OLE
DB .NET Data Provider is used to talk to any OLE DB provider (as it uses OLE DB
underneath).
Connections
Connections are used to talk to databases, and are represented by provider-specific
classes such as SqlConnection. Commands travel over connections, and result sets are
returned in the form of streams which can be read by a Data
Reader object, or pushed into a DataSet object.
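A minimal sketch of the connection lifecycle follows. The connection string mirrors the style used in this report's own code pages, but the server, credentials and database name are assumptions:

```csharp
using System;
using System.Data.SqlClient;

// Hedged sketch: opening and closing a provider-specific connection.
// Credentials and database name are illustrative assumptions.
public class ConnectionDemo
{
    public static void Main()
    {
        SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
        try
        {
            con.Open();                    // commands travel over this open connection
            Console.WriteLine(con.State);  // reports the connection state
        }
        finally
        {
            con.Close();                   // always release the connection
        }
    }
}
```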
Commands
Commands contain the information that is submitted to a database, and are
represented by provider-specific classes such as SqlCommand. A command can be a stored
procedure call, an UPDATE statement, or a statement that returns results. You can
also use input and output parameters, and return values, as part of your command
syntax. The example below shows how to issue an INSERT statement against the
Northwind database.
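The example itself did not survive in the text of this report, so the following is a hedged sketch of how such an INSERT might be issued with a SqlCommand. The connection string and the Shippers table values are illustrative assumptions:

```csharp
using System.Data.SqlClient;

// Hedged sketch: issuing a parameterized INSERT with SqlCommand.
// Connection string and values are assumptions, not from the report.
public class InsertDemo
{
    public static void Main()
    {
        SqlConnection con = new SqlConnection(
            "server=(local);database=Northwind;integrated security=true");
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO Shippers (CompanyName, Phone) VALUES (@name, @phone)", con);
        cmd.Parameters.AddWithValue("@name", "Speedy Express");
        cmd.Parameters.AddWithValue("@phone", "(503) 555-9831");

        con.Open();
        int rows = cmd.ExecuteNonQuery();  // number of rows affected
        con.Close();
    }
}
```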
Data Readers
The Data Reader object is somewhat synonymous with a read-only/forward-only
cursor over data. The Data Reader API supports flat as well as hierarchical data. A
Data Reader object is returned after executing a command against a database.
The format of the returned Data Reader object is different from a recordset.
For example, you might use the Data Reader to show the results of a search list in a
web page.
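The forward-only reading pattern can be sketched as follows. The registration table comes from this project's own pages, but the connection string values and column choices are assumptions:

```csharp
using System;
using System.Data.SqlClient;

// Hedged sketch: forward-only reading with a Data Reader.
// Connection string values and columns are assumptions.
public class ReaderDemo
{
    public static void Main()
    {
        SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
        SqlCommand cmd = new SqlCommand("select username, emailid from registration", con);

        con.Open();
        SqlDataReader reader = cmd.ExecuteReader();
        while (reader.Read())   // advances one row at a time, forward only
        {
            // Columns are read positionally; e.g. bind into a search-results list.
            Console.WriteLine("{0} <{1}>", reader.GetString(0), reader.GetString(1));
        }
        reader.Close();
        con.Close();
    }
}
```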
Data Sets and Data Adapters
Data sets
The DataSet object is similar to the ADO Recordset object, but more
powerful, and with one other important distinction: the DataSet is always disconnected.
The DataSet object represents a cache of data, with database-like structure such as
tables, columns, relationships and constraints. However, though a DataSet can and
does behave much like a database, it is important to remember that DataSet objects do
not interact directly with databases, or other source data. This allows the developer to
work with a programming model that is always consistent, regardless of where the
source data resides.
Data coming from a database, an XML file, from code, or from user input can all
be placed into DataSet objects. Then, as changes are made to the DataSet, they can be
tracked and verified before updating the source data. The GetChanges
method of the DataSet object actually creates a second DataSet that contains only the
changes to the data. This DataSet is then used by a DataAdapter to update the original
data source.
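The GetChanges flow just described can be sketched as below. The table name and adapter setup are illustrative assumptions, not from the report:

```csharp
using System.Data;
using System.Data.SqlClient;

// Hedged sketch of the GetChanges flow: extract only changed rows,
// let the adapter reconcile them, then mark the DataSet clean.
// Table name and adapter setup are assumptions.
public class ChangeFlow
{
    public static void Save(SqlDataAdapter da, DataSet ds)
    {
        DataSet changes = ds.GetChanges();   // second DataSet with only the changed rows
        if (changes != null)
        {
            da.Update(changes, "registration");  // adapter pushes just those rows
            ds.AcceptChanges();                  // original DataSet is now clean
        }
    }
}
```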
The DataSet has many XML characteristics, including the ability to produce
and consume XML data and XML schemas. XML schemas can be used to describe
data interchanged via Web services. In fact, a DataSet with a schema can actually
be compiled for type safety and statement completion.
Data Adapters
The Data Adapter object works as a bridge between the DataSet and the
source data. Using the provider-specific SqlDataAdapter (along with its associated
SqlCommand and SqlConnection) can increase overall performance when working with
a Microsoft SQL Server database. For other OLE DB-supported databases, you would
use the OleDbDataAdapter object and its associated OleDbCommand and
OleDbConnection objects.
The Data Adapter object uses commands to update the data source after
changes have been made to the DataSet. Using the Fill method of the DataAdapter calls
the SELECT command; using the Update method calls the INSERT, UPDATE or
DELETE command for each changed row. You can explicitly set these commands in
order to control the statements used at runtime to resolve changes, including the use of
stored procedures. For ad-hoc scenarios, a CommandBuilder object can generate these
at run time based upon a SELECT statement.
However, this run-time generation requires an extra round trip to the server in
order to gather the required metadata, so explicitly providing the INSERT,
UPDATE and DELETE commands at design time will result in better run-time
performance.
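The trade-off reads as follows in code. This is a hedged sketch; the table and column names are assumptions:

```csharp
using System.Data;
using System.Data.SqlClient;

// Hedged sketch contrasting run-time command generation with explicit
// design-time commands. Table and column names are assumptions.
public class AdapterSetup
{
    // Ad hoc: the CommandBuilder derives INSERT/UPDATE/DELETE from the
    // SELECT at run time, at the cost of an extra metadata round trip.
    public static SqlDataAdapter BuildAdHoc(SqlConnection con)
    {
        SqlDataAdapter da = new SqlDataAdapter("select * from registration", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        return da;
    }

    // Explicit: commands supplied at design time avoid that round trip.
    public static SqlDataAdapter BuildExplicit(SqlConnection con)
    {
        SqlDataAdapter da = new SqlDataAdapter("select sno, username from registration", con);
        SqlCommand ins = new SqlCommand(
            "insert into registration (username) values (@username)", con);
        ins.Parameters.Add("@username", SqlDbType.VarChar, 50, "username");
        da.InsertCommand = ins;
        return da;
    }
}
```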
ADO.NET was created with n-tier, stateless and XML scenarios in the forefront. Two new
objects, the DataSet and DataAdapter, are provided for these scenarios.
ADO.NET can be used to get data from a stream, or to store data in a cache for updates.
There is a lot more information about ADO.NET in the documentation.
Remember, you can execute a command directly against the database in order
to do inserts, updates and deletes; you don't need to first put data into a DataSet in
order to insert, update or delete it.
Also, you can use a DataSet to bind to the data, move through the data,
and navigate data relationships.
SQL SERVER
A database management system, or DBMS, gives the user access to their data and helps
them transform the data into information. Such management systems include dBase,
Paradox, IMS, and SQL Server. These systems allow users to create, update
and extract information from their databases.
A database is a structured collection of data. Data refers to the characteristics of
people, things and events; SQL Server stores each data item in its own field. In SQL Server,
the fields relating to a particular person, thing or event are bundled together to form a single
complete unit of data, called a record (it can also be referred to as a row or an
occurrence). Each record is made up of a number of fields; no two fields in a record can
have the same field name.
During a SQL Server database design project, the analysis of your business
needs identifies all the fields or attributes of interest. If your business needs change over
time, you define any additional fields or change the definition of existing fields.
SQL Server Tables
SQL Server stores records relating to each other in a table. Different tables are
created for the various groups of information. Related tables are grouped together to form
a database.
Primary key
Every table in SQL Server has a field or a combination of fields that uniquely
identifies each record in the table. The unique identifier is called the primary key, or simply the
key. The primary key provides the means to distinguish one record from all others in a
table. It allows the user and the database system to identify, locate and refer to one
particular record in the database.
Relational database
Sometimes all the information of interest to a business operation can be stored in
one table. SQL Server makes it very easy to link the data in multiple tables. Matching an
employee to the department in which they work is one example. This is what makes SQL
Server a relational database management system, or RDBMS. It stores data in two or
more tables and enables you to define relationships between the tables.
Foreign Key
When a field in one table matches the primary key of another table, that field is referred
to as a foreign key. A foreign key is a field, or a group of fields, in one table whose values match
those of the primary key of another table.
Referential Integrity
Not only does SQL Server allow you to link multiple tables, it also maintains
consistency between them. Ensuring that the data among related tables is correctly
matched is referred to as maintaining referential integrity.
Data Abstraction
A major purpose of a database system is to provide users with an abstract
view of the data. The system hides certain details of how data is stored and maintained.
Data abstraction is divided into three levels.
Physical level: This is the lowest level of abstraction, at which one describes how the data
are actually stored.
Conceptual level: At this level of database abstraction, all the attributes and data
actually stored are described, along with the relationships among them.
View level: This is the highest level of abstraction, at which one describes only part of the
database.
Advantages of RDBMS
Redundancy can be avoided
Inconsistency can be eliminated
Data can be shared
Standards can be enforced
Security restrictions can be applied
Integrity can be maintained
Conflicting requirements can be balanced
Disadvantages of DBMS
A significant disadvantage of a DBMS is cost. In addition to the cost of
purchasing or developing the software, the hardware has to be upgraded to allow for the
extensive programs and the workspace required for their execution and storage. While
centralization reduces duplication, it also requires that the database be adequately backed up so
that in case of failure the data can be recovered.
Features of SQL SERVER (RDBMS)
SQL Server is one of the leading database management systems (DBMS)
because it is the only database that meets the uncompromising requirements of
today's most demanding information systems. From complex decision support systems
(DSS) to the most rigorous online transaction processing (OLTP) applications, even
applications that require simultaneous DSS and OLTP access to the same critical data,
SQL Server leads the industry in both performance and capability.
SQL SERVER is a truly portable, distributed, and open DBMS that delivers
unmatched performance, continuous operation, and support for every database.
SQL SERVER RDBMS is a high-performance, fault-tolerant DBMS which is
specially designed for online transaction processing and for handling large database
applications.
SQL SERVER with the transaction processing option offers two features which
contribute to a very high level of transaction processing throughput:
The row level lock manager
Enterprise wide Data Sharing
The unrivaled portability and connectivity of the SQL Server DBMS enables
all the systems in the organization to be linked into a single, integrated computing resource.
Portability
SQL SERVER is fully portable to more than 80 distinct hardware and operating
system platforms, including UNIX, MS-DOS, OS/2, Macintosh and dozens of
proprietary platforms. This portability gives complete freedom to choose the database
server platform that meets the system requirements.
Open System
SQL SERVER offers a leading implementation of industry-standard SQL.
SQL Server's open architecture integrates SQL SERVER and non-SQL SERVER
DBMSs with the industry's most comprehensive collection of tools, applications, and third-party
software products. SQL Server's open architecture provides transparent access to data
from other relational databases and even non-relational databases.
Distributed Data Sharing
SQL Server's networking and distributed database capabilities allow access to
data stored on a remote server with the same ease as if the information was stored on a
single local computer. A single SQL statement can access data at multiple sites. You can
store data where system requirements such as performance, security or availability
dictate.
Unmatched performance
The most advanced architecture in the industry allows the SQL SERVER
DBMS to deliver unmatched performance.
Sophisticated Concurrency Control
Real-world applications demand access to critical data. With most
database systems, applications become contention bound, meaning performance
is limited not by CPU power or by disk I/O, but by users waiting on one another.
SQL Server's contention-free queries minimize, and in many cases entirely eliminate,
contention wait times.
No I/O Bottlenecks
SQL Server's fast commit, group commit, and deferred write
technologies dramatically reduce disk I/O bottlenecks. While some databases write
whole data blocks to disk at commit time, SQL SERVER commits transactions with at
most one sequential write to the log file on disk. On high-throughput systems, one
sequential write to the log file can commit multiple transactions. Data read by a transaction
remains in shared memory so that other transactions may access that data without
reading it again from disk. Since fast commits write all data necessary for recovery to
the log file, modified blocks are written back to the database independently of the
transaction's commit, when written from memory to disk.
Testing is the process of confirming that a program or system does what it is
supposed to do. Testing is the only way to assure the quality of s/w, and it is an umbrella
activity rather than a separate phase. It is an activity to be performed in parallel with
the s/w effort, and one that consists of its own phases of analysis, design,
implementation, execution and maintenance.
Testing Strategy:
Unit Testing:
This testing method considers a module as a single unit and checks the unit at its
interfaces and its communication with other modules rather than getting into details at the
statement level. Here the module will be treated as a BLACK BOX, which will take some
inputs and generate output. Outputs for a given set of input combinations are pre-calculated
and compared against those generated by the module.
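As a minimal illustration of this black-box idea, pre-calculated outputs for chosen inputs can be compared with what the module produces. The Payroll module and its figures here are hypothetical, not part of the project:

```csharp
using System;

// Hedged sketch: black-box unit testing -- the module's internals are
// ignored; only chosen inputs and pre-calculated outputs matter.
// The Payroll.NetSalary method is hypothetical, not from the project.
public class Payroll
{
    public static double NetSalary(double gross, double deductions)
    {
        return gross - deductions;
    }
}

public class UnitTests
{
    public static void Main()
    {
        // Input combinations and expected outputs fixed in advance.
        if (Payroll.NetSalary(20000, 2500) != 17500)
            throw new Exception("NetSalary failed for (20000, 2500)");
        if (Payroll.NetSalary(0, 0) != 0)
            throw new Exception("NetSalary failed for (0, 0)");
        Console.WriteLine("All unit tests passed");
    }
}
```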
Integration Testing:
Here all the pre-tested individual modules will be assembled to create a
larger system, and tests are carried out at the system level to make sure that all modules
are working in synchronization with each other. This testing methodology helps in making
sure that modules which run perfectly when checked individually also run in
cohesion with the other modules. For this testing we create test cases to check
all modules once, and then generate test combinations of test paths throughout the
system to make sure that no path is making its way into chaos.
Validation Testing:
Testing is a major quality control measure employed during software development. Its
basic function is to detect errors. Sub-functions, when combined, may not produce what
is desired. Global data structures can present problems. Integration testing is a
systematic technique for constructing the program structure while conducting tests
to uncover errors that are associated with interfacing. The objective is to take unit-tested
modules and build a program structure that has been dictated by design. In non-incremental
integration, all the modules are combined in advance and the program is
tested as a whole; here an error can appear
in an endless loop of corrections. In incremental testing the program is constructed and
tested in small segments, where the errors are isolated and corrected.
Different incremental integration strategies are
1. Top-down integration
2. Bottom-up integration
3. Regression integration
Testing means quality testing. Testing is a process of executing a program with the
intent of finding errors. A good test case is one that has a high probability of finding an as
yet undiscovered error.
The objective should be to design tests that systematically uncover different classes
of error, and to do so with a minimum amount of time and effort. Testing cannot show the
absence of defects; it can only show that s/w defects are present. It is important to keep
this statement in mind as testing is being conducted.
Any engineering product can be tested in one of two ways.
Knowing the specific functions that a product has been designed to perform, tests
can be conducted that demonstrate each function is fully operational. This approach is
called BLACK BOX TESTING.
Knowing the internal workings of the product, tests can be conducted to ensure
that all gears mesh, that is, that the internal operation of the product performs according to
specification and all internal components have been adequately exercised. This
approach is called WHITE BOX TESTING.
These approaches provide a mechanism that can help to ensure the
completeness of tests and provide the highest likelihood for uncovering errors in s/w.
The goals of verification and validation are to assess and improve the quality of the
work products generated during development and modification of s/w. There are two
types of verification, namely:
1. Life-cycle verification
2. Formal verification.
Validation is the process of evaluating s/w at the end of the s/w development process.
Quality assurance is a planned and systematic pattern of actions necessary to
provide adequate confidence that the product conforms to the technical requirements.
Walkthroughs are sessions where the material being examined is
presented by a reviewer and evaluated by a team of reviewers.
Inspection involves assessing the s/w life cycle and improving the quality of
work products.
Life-cycle verification is the process of determining the degree to which the
work products of a given phase of the development cycle fulfill the specification
established during prior phases.
Formal verification is a rigorous mathematical demonstration that source code
conforms to its specification.
High quality cannot be achieved through testing of source code alone.
Although a program should be totally free of errors, this is seldom the case for large s/w
products. There are three major categories of s/w error:
1. Requirement errors
2. Design errors
3. Implementation errors
Quality assurance defines the objective of the project and reviews the overall
activities so that the errors are corrected early in the development process.
During analysis and design, an s/w verification plan and an acceptance test plan are
prepared. The verification plan describes the methods to be used in verifying that the
requirements are satisfied by the design documents and that the source code is consistent
with the requirements specification and design documents. The acceptance test plan
includes test cases, expected outcomes, and the capabilities demonstrated by each test case.
Following completion of the verification plan and
acceptance plan, an s/w verification review is held to evaluate the adequacy of the
plans.
During product evolution, in-process audits are conducted to verify consistency
and completeness of the work products. Items to be audited for consistency include
interface specifications for hardware, software and people: internal design versus
functional requirements, and functional requirements versus test descriptions.
Prior to product delivery, a functional audit and a physical audit are performed. The
functional audit reconfirms that all the requirements have been met. The physical audit
verifies that the source code and all associated documents are complete, consistent
with one another and ready to deliver. An s/w verification summary is prepared to
describe the results of all reviews.
SYSTEM TESTING:
A system is tested for online response, volume of transactions, stress,
recovery from failure, and usability. System testing involves two kinds of activities:
integration testing and acceptance testing.
ACCEPTANCE TESTING:
It involves planning and execution of functional tests and stress tests in order to
demonstrate that the implemented system satisfies its requirements.
Tools of special importance during acceptance testing include:
1. Test coverage analyzer - records the control paths followed for each test case.
2. Timing analyzer - also called a profiler, reports the time spent in various
regions of the code; these are areas to concentrate on to improve system performance.
3. Coding standards checkers - static analyzers and standards checkers are used to inspect code
for deviations from standards and guidelines.
ALPHA and BETA TESTING:
If s/w is developed as a product to be used by many customers, it is impractical to
perform formal acceptance tests with each one. So most developers use alpha and
beta testing to uncover errors that only the end user seems able to find.
Alpha testing is conducted by the customer in the presence of the project
leaders, who record the errors and usage problems the customer faces.
Beta testing is conducted at the customer's site by the end users of the s/w. The
customer records the problems encountered during beta testing and sends them to us at
regular intervals. We then make the modifications and release the updated product to the entire
customer base.
Home page:

Personal Detail login:

Personal registration:

Register Success:

Login form:

Personal information details:

Address details:

Education details:

Experience details:

Exp success:

Interview assignment:

Junior level management:

Junior success:

Senior management:

Selection report:

Selection report:

Selection list from written text:

Intimation report:

Call letter from interview:

Intimation letter:
CODING:
Coding for home page:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class Home : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void LinkButton1_Click(object sender, EventArgs e)
    {
    }

    protected void LinkButton1_Click1(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    protected void LinkButton2_Click(object sender, EventArgs e)
    {
    }

    protected void LinkButton2_Click1(object sender, EventArgs e)
    {
        Response.Redirect("interviewassignment.aspx");
    }
}
Coding for login page:
using System;
using System.Data;
using System.Configuration;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class _Default : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();
    int i;

    protected void Page_Load(object sender, EventArgs e)
    {
        Label3.Visible = false;
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("registration.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from registration", con);
        da.Fill(ds, "registration");
        for (i = 0; i < ds.Tables["registration"].Rows.Count; i++)
        {
            if (TextBox1.Text == ds.Tables["registration"].Rows[i]["username"].ToString()
                && TextBox2.Text == ds.Tables["registration"].Rows[i]["password"].ToString())
            {
                Response.Redirect("personalinfo.aspx");
            }
        }
        // Report failure only after every row has been checked.
        Label3.Visible = true;
        Label3.Text = "invalid username/password";
    }
}
Coding for registration:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class registration : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();

    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void LinkButton1_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from registration", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "registration");

        DataRow dr = ds.Tables["registration"].NewRow();
        dr["username"] = TextBox1.Text;
        dr["password"] = TextBox2.Text;
        dr["phoneno"] = TextBox4.Text;
        dr["gender"] = RadioButtonList1.SelectedItem.ToString();
        dr["qualification"] = TextBox6.Text;
        dr["address"] = TextBox7.Text;
        dr["emailid"] = TextBox8.Text;
        dr["city"] = TextBox9.Text;
        dr["state"] = TextBox10.Text;
        dr["country"] = TextBox11.Text;
        ds.Tables["registration"].Rows.Add(dr);
        da.Update(ds, "registration");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox6.Text = "";
        TextBox7.Text = "";
        TextBox8.Text = "";
        TextBox9.Text = "";
        TextBox10.Text = "";
        TextBox11.Text = "";

        Response.Write("<script>alert('successfully registered')</script>");
    }
}
Coding for personalinformation:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class personalinfo : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();
    int i;

    protected void Page_Load(object sender, EventArgs e)
    {
        count();
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from personalinfo", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "personalinfo");

        DataRow dr = ds.Tables["personalinfo"].NewRow();
        dr["sno"] = TextBox1.Text;
        dr["fathername"] = TextBox2.Text;
        dr["native"] = TextBox3.Text;
        dr["religion"] = TextBox4.Text;
        dr["cast1"] = TextBox5.Text;
        dr["name"] = TextBox6.Text;
        dr["otherdept"] = TextBox7.Text;
        dr["healthhist"] = TextBox8.Text;
        dr["leasureactivities"] = TextBox9.Text;
        dr["location"] = DropDownList1.SelectedItem.ToString();
        ds.Tables["personalinfo"].Rows.Add(dr);
        da.Update(ds, "personalinfo");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox6.Text = "";
        TextBox7.Text = "";
        TextBox8.Text = "";
        TextBox9.Text = "";

        Response.Redirect("addressdetails.aspx");
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    public void count()
    {
        da = new SqlDataAdapter("select * from personalinfo", con);
        da.Fill(ds, "personalinfo");
        i = ds.Tables["personalinfo"].Rows.Count;
        i = i + 1;
        TextBox1.Text = i.ToString();
    }
}
Coding for addressdetails:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class addressdetails : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();
    int i;

    protected void Page_Load(object sender, EventArgs e)
    {
        count();
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from addressdetails", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "addressdetails");

        DataRow dr = ds.Tables["addressdetails"].NewRow();
        dr["sno"] = TextBox1.Text;
        dr["name"] = TextBox2.Text;
        dr["permaddress"] = TextBox3.Text;
        dr["phone1"] = TextBox4.Text;
        dr["faxno"] = TextBox5.Text;
        dr["emailid"] = TextBox6.Text;
        dr["phone2"] = TextBox7.Text;
        dr["permcity"] = TextBox8.Text;
        dr["presentadd"] = TextBox9.Text;
        dr["presentcity"] = TextBox10.Text;
        ds.Tables["addressdetails"].Rows.Add(dr);
        da.Update(ds, "addressdetails");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox6.Text = "";
        TextBox7.Text = "";
        TextBox8.Text = "";
        TextBox9.Text = "";
        TextBox10.Text = "";

        Response.Redirect("educationdetails.aspx");
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    public void count()
    {
        da = new SqlDataAdapter("select * from addressdetails", con);
        da.Fill(ds, "addressdetails");
        i = ds.Tables["addressdetails"].Rows.Count;
        i = i + 1;
        TextBox1.Text = i.ToString();
    }
}
Coding for education details:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class educationdetails : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();
    int i;

    protected void Page_Load(object sender, EventArgs e)
    {
        count();
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from educationdetails", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "educationdetails");

        DataRow dr = ds.Tables["educationdetails"].NewRow();
        dr["sno"] = TextBox1.Text;
        dr["name"] = TextBox2.Text;
        dr["qualification"] = TextBox3.Text;
        dr["passedyear"] = TextBox4.Text;
        dr["division"] = DropDownList1.SelectedItem.ToString();
        dr["instaddress"] = TextBox6.Text;
        dr["city"] = TextBox7.Text;
        dr["marks"] = TextBox8.Text;
        dr["majorsubs"] = DropDownList2.SelectedItem.ToString();
        dr["awards"] = TextBox10.Text;
        ds.Tables["educationdetails"].Rows.Add(dr);
        da.Update(ds, "educationdetails");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox6.Text = "";
        TextBox7.Text = "";
        TextBox8.Text = "";
        TextBox10.Text = "";

        Response.Redirect("expdetails.aspx");
    }

    public void count()
    {
        da = new SqlDataAdapter("select * from educationdetails", con);
        da.Fill(ds, "educationdetails");
        i = ds.Tables["educationdetails"].Rows.Count;
        i = i + 1;
        TextBox1.Text = i.ToString();
    }
}
Coding for Experience Details:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class expdetails : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();
    int i;

    protected void Page_Load(object sender, EventArgs e)
    {
        count();
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("Home.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from expdetails", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "expdetails");

        DataRow dr = ds.Tables["expdetails"].NewRow();
        dr["sno"] = TextBox1.Text;
        dr["sname"] = TextBox2.Text;
        dr["startdate"] = TextBox3.Text;
        dr["enddate"] = TextBox4.Text;
        dr["orgaddress"] = TextBox5.Text;
        dr["designation"] = TextBox6.Text;
        dr["city"] = TextBox7.Text;
        dr["remunation"] = TextBox8.Text;
        ds.Tables["expdetails"].Rows.Add(dr);
        da.Update(ds, "expdetails");

        // The alert must be wrapped in a <script> tag to execute in the browser
        Response.Write("<script>alert('Updated Successfully')</script>");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox5.Text = "";
        TextBox6.Text = "";
        TextBox7.Text = "";
        TextBox8.Text = "";
    }

    public void count()
    {
        da = new SqlDataAdapter("select * from expdetails", con);
        da.Fill(ds, "expdetails");
        i = ds.Tables["expdetails"].Rows.Count;
        i = i + 1;
        TextBox1.Text = i.ToString();
    }
}
Interview assignment:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class interviewassignment : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void LinkButton1_Click(object sender, EventArgs e)
    {
        Response.Redirect("juniormanagement.aspx");
    }

    protected void LinkButton2_Click(object sender, EventArgs e)
    {
        // Note the corrected extension: ".aspx", not ".apx"
        Response.Redirect("seniormanagement1.aspx");
    }
}
Assignment for junior level:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class juniormanagement : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();

    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        da = new SqlDataAdapter("select * from juniormanagement", con);
        SqlCommandBuilder cb = new SqlCommandBuilder(da);
        da.Fill(ds, "juniormanagement");

        DataRow dr = ds.Tables["juniormanagement"].NewRow();
        dr["sno"] = TextBox1.Text;
        dr["sname"] = TextBox2.Text;
        dr["achivementmotivation"] = DropDownList1.SelectedItem.ToString();
        dr["teamspirit"] = DropDownList2.SelectedItem.ToString();
        dr["jobknowledge"] = DropDownList3.SelectedItem.ToString();
        dr["problemsolving"] = DropDownList4.SelectedItem.ToString();
        dr["interviewdate"] = TextBox3.Text;
        dr["positioncode"] = TextBox4.Text;
        dr["interviewer"] = TextBox5.Text;
        dr["planning"] = DropDownList5.SelectedItem.ToString();
        dr["objectivity"] = DropDownList6.SelectedItem.ToString();
        dr["communication"] = DropDownList7.SelectedItem.ToString();
        dr["assertment"] = DropDownList8.SelectedItem.ToString();
        ds.Tables["juniormanagement"].Rows.Add(dr);
        da.Update(ds, "juniormanagement");

        TextBox1.Text = "";
        TextBox2.Text = "";
        TextBox3.Text = "";
        TextBox4.Text = "";
        TextBox5.Text = "";

        // The alert must be wrapped in a <script> tag to execute in the browser
        Response.Write("<script>alert('successfully added')</script>");
    }
}
Senior management interview report:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class seniormanagement1 : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void LinkButton1_Click(object sender, EventArgs e)
    {
        Response.Redirect("reports.aspx");
    }

    protected void LinkButton2_Click(object sender, EventArgs e)
    {
        Response.Redirect("overallperformance.aspx");
    }
}
Coding for report:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class reports : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void Button3_Click(object sender, EventArgs e)
    {
        Response.Redirect("login.aspx");
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        Response.Redirect("selectionlistreport.aspx");
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
    }
}
Selection list report:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

public partial class selectionlistreport : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        Response.Redirect("written.aspx");
    }

    protected void Button2_Click(object sender, EventArgs e)
    {
        Response.Redirect("intimation.aspx");
    }
}
List from Written Test:
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;

public partial class written : System.Web.UI.Page
{
    SqlConnection con = new SqlConnection("user id=sa;password=sa;database=ERPM1");
    SqlDataAdapter da;
    DataSet ds = new DataSet();

    protected void Page_Load(object sender, EventArgs e)
    {
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        // Looks up the candidate's address record by the name typed in TextBox1
        da = new SqlDataAdapter(
            "select * from addressdetails where name='" + TextBox1.Text + "'", con);
        da.Fill(ds, "addressdetails");
        GridView1.DataSource = ds.Tables["addressdetails"];
        GridView1.DataBind();
    }
}
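Concatenating TextBox1.Text directly into the SQL string leaves this lookup open to SQL injection. A safer variant of Button1_Click, sketched below under the same assumptions as the listing above (the ERPM1 database and its addressdetails table), passes the name as a parameter instead:

```csharp
protected void Button1_Click(object sender, EventArgs e)
{
    // Parameterized lookup: the user's input is supplied as a parameter
    // value and is never spliced into the SQL text itself.
    SqlCommand cmd = new SqlCommand(
        "select * from addressdetails where name = @name", con);
    cmd.Parameters.AddWithValue("@name", TextBox1.Text);

    da = new SqlDataAdapter(cmd);
    da.Fill(ds, "addressdetails");
    GridView1.DataSource = ds.Tables["addressdetails"];
    GridView1.DataBind();
}
```

The same pattern would apply to any of the other pages that build SQL from form fields.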
CONCLUSION
All the objectives charted out in the initial phases were achieved successfully.
System Features:
The system satisfies all the requirements for which the company developed it. It has strong security, is fully GUI based, and is easy to operate and user-friendly. The platform includes an inbuilt backup and recovery facility.
Working on the project was a good experience. Working together in teams helped us to communicate better, and we understood the importance of planning and design as part of software development.

The practice of peer reviews helped us rectify problems as and when they occurred, and also brought in valuable suggestions that we incorporated. Developing the project helped us gain experience of real-time development procedures.
8.1 BIBLIOGRAPHY
1. C# .NET : C# .NET Unleashed
2. SQL Server 2000 : SQL Unleashed