E-Farming

DATA FLOW DIAGRAM

Data flow diagramming is a means of representing a system at any level of detail with a graphic network of symbols showing data flows, data stores, data processes, and data sources/destinations.

The data flow diagram is analogous to a road map. It is a network model of all possibilities, with different levels of detail shown at different hierarchical levels. This process of representing different levels of detail is called leveling or partitioning by some data flow diagram advocates. Like a road map, there is no starting or stopping point, no time or timing, and no steps to get somewhere. We just know that the data path must exist because at some point it will be needed, just as a road map shows every existing or planned road because the road is needed.

Details that are not shown on the different levels of the data flow diagram, such as volumes, timing, and frequency, are shown on supplementary diagrams or in the data dictionary. For example, data store contents may be shown in the data dictionary.

A data flow diagram (DFD) uses a number of symbols to represent the system. The data flow diagram, also known as a bubble chart, is used to clarify system requirements and identify the major transformations that will become programs in system design. It is therefore the starting point of the design phase, functionally decomposing the requirements specification down to the lowest level of detail.

Terms used in DFD

Process

A process transforms data values. The lowest-level processes are pure functions without side effects. An entire data flow graph is itself a high-level process.

Graphical representation:

Data flows

A data flow connects the output of an object or process to the input of another object or process. It represents an intermediate data value within a computation. It is represented by an arrow and labeled with a description of the data, usually its name or type.

Actors

An actor is an active object that drives the data flow graph by producing or consuming values.

Data store

A data store is a passive object within a data flow diagram that stores data for later access.

External Entity

A rectangle represents an external entity, such as a librarian or a library member.

Output Symbol

This box represents data produced during human-computer interaction.

V. Program Architecture

This software is based on the Model View Controller (MVC) design pattern.

MVC gives you a clean separation between your data (model), your logic/business layer (controller), and your display (view). In theory, this allows you to change individual layers without affecting the other layers. You could have a configurable data source (MySQL Server, XML, etc.), different controllers depending on the user (admin, anonymous, etc.), or various views (console, web form, Windows form). MVC makes it easy to implement change, which is what programming in the real world is all about. These three objects are known as the Model, the View, and the Controller.

VIEW

The view is the graphical presentation (output) of data, irrespective of the real data processing. The view is responsible for look and feel, custom formatting, sorting, etc. The view is completely isolated from the actual complex data operations: it simply gets the final row data from the model and applies some cosmetics and formatting before displaying it in the browser. The view provides the interface through which users interact with the system. The beauty of the MVC approach is that it supports any kind of view, which is challenging in today's distributed and multi-platform environment. An MVC model can have multiple views, which are controlled by the controller. The view interface can be Web Forms, HTML, XML/XSLT, XHTML, or WML, or it can be Windows Forms, etc.

MODEL

It is the domain-specific representation of the information on which the application operates. Domain logic adds meaning to raw data.

Many applications use a persistent storage mechanism (such as a database) to store data. MVC does not specifically mention the data access layer because it is understood to be underneath or encapsulated by the Model.

CONTROLLER

The controller is responsible for handling user actions. It responds to mouse or keyboard input and commands the model and view to change. Controllers are associated with views. User interaction triggers events that call methods on the model to update its state; the model then notifies the registered views so they can refresh their display.
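As a minimal sketch of this separation (the class names ProductModel, ConsoleProductView, and ProductController are hypothetical, not the project's actual code), the following C# fragment shows a model that raises a change event, a view that only renders data, and a controller that wires user input to the model:

using System;
using System.Collections.Generic;

// Model: holds domain data and notifies observers when it changes.
class ProductModel
{
    private List<string> products = new List<string>();
    public event EventHandler Changed;

    public List<string> GetProducts() { return new List<string>(products); }

    public void AddProduct(string name)
    {
        products.Add(name);
        if (Changed != null) Changed(this, EventArgs.Empty); // notify registered views
    }
}

// View: renders model data; contains no business logic.
class ConsoleProductView
{
    public void Render(List<string> products)
    {
        Console.WriteLine("Products: " + string.Join(", ", products.ToArray()));
    }
}

// Controller: translates user input into model updates and wires the model to the view.
class ProductController
{
    private ProductModel model;
    private ConsoleProductView view;

    public ProductController(ProductModel model, ConsoleProductView view)
    {
        this.model = model;
        this.view = view;
        model.Changed += delegate { view.Render(model.GetProducts()); };
    }

    public void OnUserAddsProduct(string name) { model.AddProduct(name); }
}

class MvcDemo
{
    static void Main()
    {
        ProductController controller =
            new ProductController(new ProductModel(), new ConsoleProductView());
        controller.OnUserAddsProduct("Wheat"); // view re-renders automatically
    }
}

Here the controller registers the view as an observer of the model, so any change made through the controller is automatically reflected in the view without either layer knowing the other's internals.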

BENEFITS OF MVC DESIGN

The following are a few of the benefits of the MVC design pattern.

Since MVC handles multiple views using the same enterprise model, it is easier to maintain, test, and upgrade a multiple-view system.

It is easier to add new clients just by adding their views and controllers.

Since the model is completely decoupled from the view, there is a lot of flexibility to design and implement the model with reusability and modularity in mind. The model can also be extended for further distributed applications.

It is possible to develop the model, view, and controller in parallel.

Our project follows the architecture shown below:

Project Requirements

I.) Software Requirements
II.) Hardware Requirements

Client Side:
Software: Internet Explorer 6.0
Hardware: Pentium II at 500 MHz processor, 64 MB RAM, 1 GB disk space

Server Side:
Software: WAMP, MySQL
Hardware: Pentium IV at 1 GHz processor

Organization profile

III.) EVENT TRACE DIAGRAM

Event trace diagram for user login (Student & Placement Officer)

[Participants: main page, login page, login record (db), student or PO page]

Event trace diagram for recruiters

[Participants: main page, recruiters page]

The main objective of this project is to build a website that will help Indian farmers cultivate effectively by providing up-to-date information, and that will create a path for Indian villages to earn more money by selling their products to different cities online.

Suppose some village farmers want to use this facility and want to learn how it is possible and how they can use e-farming to sell their products. If they have knowledge of computers, they can directly register on the site and sell their products; otherwise they can contact the company's computer professionals, who will schedule classes to teach them the basics of computers and the Internet, such as how to open this site, register with it, and sell their products online.

On the other side, wholesalers from towns can also register and buy products as per their needs.

Modules

1. News
2. Weather
3. Classifieds
4. New products
5. Trials & Info
6. Research
7. Loan
8. Market

News:

It provides the latest news needed by farmers. It contains three sub-modules:

i. E-farming news

ii. Agricultural news

iii. E-farming announcements

Weather:

i. Rain gauge

ii. Weather & observations

iii. Wind & marine forecasts

Classifieds:

i. Property

ii. Machinery

iii. Produce

iv. General

New products:

This module contains information about new inventions for agriculture.

Trials & Info:

This module gives information about research in agriculture and new, useful inventions for the best and most effective cultivation.

Community:

This module is specially made for articles based on agricultural information from important news sources.

Market:

This is the place for farmers to sell goods online without any intermediary.

i. Admin module

ii. Users (farmers, wholesalers)

iii. Inventory

iv. Cold storage

v. Product acceptance

vi. Order form

vii. Taking the consumer's credit card

viii. Acknowledging receipt of the order

ix. Delivery

Data Flow Diagram

Level 0

Level 1

Level 2

2.1 Project Planning and Scheduling

2.1.1 Project Development Approach

This project was given to me by the company to fulfill the client's requirements. Nowadays, e-business and online shopping is a fast-growing industry, so everybody wants their business to become global and worldwide. Our client wants to develop a website that provides their customers the facility to buy products online. For the website, an extraordinary and attractive look is very necessary to attract new customers, and the second essential is good functionality. The user can easily find products by category and subcategory, view the full description of a product, place an order, and pay the bill online. Various other features like these can be executed with minimum effort, and the database design has been set up for this website accordingly.

Software process model

To solve actual problems in an industry setting, a software engineer or a team of engineers must incorporate a development strategy that encompasses the process, methods and tools layers, and the generic phases. This strategy is often referred to as a process model or a software engineering paradigm. A process model for software engineering is chosen based on the nature of the project and application, the methods and tools to be used, and the controls and deliverables that are required.

2.1.2 Project Planning

Software project scheduling is an activity that distributes estimated effort across the planned duration by allocating the effort to specific software engineering tasks.

For this project, we collected the project requirements from the client. The requirements were also examined from different aspects of development. Then the whole project was divided into modules, and the development work was carried out according to those modules. The DFDs were designed according to the requirements. For each module, the design was developed and then implemented.

2.1.3 Schedule Representation

2.2 Risk Management:

Risk management is the process of measuring, or assessing, risk and developing strategies to manage it. Strategies include transferring the risk to another party, avoiding the risk, reducing the negative effect of the risk, and accepting some or all of the consequences of a particular risk. Traditional risk management focuses on risks stemming from physical or legal causes (e.g. natural disasters, fires, accidents, death, and lawsuits). Financial risk management, on the other hand, focuses on risks that can be managed using traded financial instruments.

In ideal risk management, a prioritization process is followed whereby the risks with the greatest loss and the greatest probability of occurring are handled first, and risks with lower probability of occurrence and lower loss are handled later.

The first step in the risk management process, establishing the context, involves:

Planning the remainder of the process.

Mapping out the following: the scope of the exercise, the identity and objectives of stakeholders and the basis upon which risks will be evaluated.

Defining a framework for the process and an agenda for the Identification.

Developing an analysis of the risks involved in the process.

2.2.1 Risk Identification:

After establishing the context, the next step in the process of managing risk is to identify potential risks. Risks are about events that, when triggered, cause problems. Hence, risk identification can start with the source of problems, or with the problem itself.

In this project there can be the following risks:

One risk is associated with the software itself: if the wrong user is authorized by mistake, they may make changes that put the system into a dangerous state. There can also be the risk of natural threats.

2.2.2 Risk Analysis:

Once risks have been identified, they must then be assessed as to their potential severity of loss and the probability of occurrence. Regardless of the prevention techniques employed, possible threats that could arise inside or outside the organization need to be assessed. Regardless of the type of threat, the goals of business recovery planning are to ensure the safety of customers, employees, and other personnel during and following a disaster.

The relative probability of a disaster occurring should be determined. Here, the first risk can occur because of a lack of communication with all branches of Apollo regarding requirement fulfillment. For example, the company may not have interacted with the branch of Apollo in the U.S.A., and that branch may need some additional functionality in the software.

If any person obtains the administrator password by mistake, he can change the data in the software and can leak information. The same thing occurs if the wrong user is authorized. The software may also be endangered by natural threats, e.g. internal flooding, external flooding, internal fire, external fire, etc.

2.2.3 Disaster Prevention:

Because a goal of business recovery planning is to ensure the safety of personnel and assets during and following a disaster, a critical aspect of the risk analysis process is to identify the preparedness and preventive measures in place at any point in time. Once the potential areas of high exposure to the organization are identified, additional preventive measures can be considered for implementation.

Disaster prevention and preparedness begin at the top of an organization. The attitude of senior management toward security and prevention should permeate the entire organization. Therefore, management's support of disaster planning can focus attention on good security and prevention techniques and better prepare the organization for the unwelcome and unwanted.

Disaster prevention techniques fall into two categories:

Procedural prevention

Physical prevention

Procedural prevention relates to activities performed on a day-to-day, month-to-month, or annual basis, relating to security and recovery. Procedural prevention begins with assigning responsibility for overall security of the organization to an individual with adequate competence and authority to meet the challenges. The objective of procedural prevention is to define activities necessary to prevent various types of disasters and ensure that these activities are performed regularly.

Physical prevention and preparedness for disaster begin when a site is constructed. It includes special requirements for building construction, as well as fire protection for various equipment components. Special considerations include: the computer area, fire detection and extinguishing systems, records protection, air conditioning, heating and ventilation, electrical supply and UPS systems, emergency procedures, vault storage area(s), and archival systems.

2.2.4 Risk Planning:

Once risks have been identified and assessed, all techniques to manage the risk fall into one or more of these four major categories:

Tolerate (retention)

Treat (mitigation)

Terminate (elimination)

Transfer (buying insurance)

Ideal use of these strategies may not be possible. Some of them may involve trade-offs that are not acceptable to the organization or person making the risk management decisions.

2.2.5 Risk avoidance:

Risk avoidance includes not performing an activity that could carry risk. An example would be not buying a property or business in order to avoid the liability that comes with it. Another would be not flying in order to avoid the risk that the airplane might be hijacked. Avoidance may seem the answer to all risks, but avoiding risks also means losing out on the potential gain that accepting (retaining) the risk may have allowed; not entering a business to avoid the risk of loss also avoids the possibility of earning profits.

2.2.6 Risk reduction:

Risk reduction involves methods that reduce the severity of the loss. Examples include sprinklers designed to put out a fire to reduce the risk of loss by fire. This method may cause a greater loss by water damage and therefore may not be suitable. Halon fire suppression systems may mitigate that risk, but the cost may be prohibitive as a strategy. Modern software development methodologies reduce risk by developing and delivering software incrementally. Early methodologies suffered from the fact that they only delivered software in the final phase of development; any problems encountered in earlier phases meant costly rework and often jeopardized the whole project. By developing in iterations, a software project can limit the effort wasted to a single iteration. A current trend in software development, spearheaded by the extreme programming community, is to reduce the size of iterations to the smallest size possible, sometimes allocating as little as one week to an iteration.

1. System Analysis

1.2 Objective

A key objective of this system is that it consolidates the data in one central location. This enables easy management of all information and ensures data integrity across the entire breadth of the system. The online order management system will provide a facility for ordering any product, so that the customer can easily get the product and also make inquiries online. It is also very useful to customers because it saves their time and money.

To enable shopping online via the Internet.

Customers can see all the items available in the shop and get full, detailed information about a particular product.

Customers have no need to go outside the home for shopping.

It saves the customer's time.

Customers can also pay for the purchased products online.

To provide a facility so that, if a product is not found, its details can be given to the admin; the admin will then try their best to provide that product.

The main objective of the project is to create a system that allows users to order products based on the product name. The selected products are displayed in a tabular format, and the user can order their products online through credit card payment.

1.3 Current Scenario

Currently the system is static, so the customer can only see the products online but cannot order them. The customer has to do it manually, and payment is also done manually. So we are trying to make the system dynamic, and we will replace the current ordering system with the latest technology.

Drawbacks of the current system

Since the ordering of products is currently done offline, it is time consuming.

Customers also have to pay somewhat more.

1.4 Proposed solution

The system will be made such that the customer can order the product online, and payment can also be made online through credit card, Visa card, etc. So the new system will save the customer time and money.

1.5 Preliminary Analysis

1.5.1 Scope

Due to Internet access, customers use the Internet more and more for their shopping needs and for more of their work-related activity. They also save time and money by doing so.

This system should provide a facility to register online for online ordering.

The system should provide a facility for inquiries if the customer has any query.

The system should provide a facility for online payment.

This system should be able to generate customer-related reports.

1.5.2 Feasibility Study

A feasibility study is carried out to select the best system that meets performance requirements. A feasibility study is designed to provide an overview of the primary issues related to a business idea. The purpose is to identify any make-or-break issues that would prevent the business from being successful in the marketplace. In other words, a feasibility study determines whether the idea makes sense.

The feasibility study provides a lot of the information necessary for the business plan. For example, a good market analysis is necessary in order to determine the project's feasibility. This information provides the basis for the market section of the business plan.

Technical Feasibility

The system will be hosted on the Internet, so the company needs an Internet connection in the organization, as well as MS Access to store the data related to products and customers.

The system can be accessed from any platform, with no concern for open source versus Microsoft technology; the only thing needed is that the .NET Framework must be installed on the application server. I am able to complete the project within the specified time.

This system can be easily supported by the hardware and software requirements of any system. The system can also produce its required output.

The proposed system must provide adequate responses to inquiries, regardless of the number or locations of the users. There must be technical guarantees of accuracy, reliability, ease of access, and data security.

Economical Feasibility:

Economical feasibility addresses the following issues:

The software resource requirements of the proposed system are the .NET Framework 2.0 and MS Access, which are already owned by the organization and do not require additional investment.

To declare that the system is economically feasible, the benefits obtained from the system have to be weighed against the cost incurred to actually develop it. Because the software used here is already owned by the company, and the company already has a server that can run this system, little additional cost is involved. The hardware requirement of the system is at least a PC for an administrator to handle the site from the admin panel. The development cost of the project is not very high.

Behavioral Feasibility:

Whether the proposed system will behave according to the requirements must be checked. The response time of the system must be noted, because it is a web-based system: whether it takes too much time to respond, gives a quick response, or responds within a specific period of time is the most important consideration. Whether time-consuming processes can run in this environment is also important.

Operational Feasibility:

The proposed system will meet operational requirements like system performance, accessibility of information, client acceptance, and efficient solutions to the queries of the user. If the user has some basic knowledge of the Internet, the user can operate this application easily. It provides an easy user interface.

Operational feasibility has been considered from the user's point of view. This application, once deployed, can run easily without any maintenance at this point in time. After the inclusion of the database in the future, the database might need some clean-up after some period of time. If the database size becomes large, it might need some changes in the handling of the application and might require some optimization so that the application runs faster and retrieves data faster.

1.6 System Requirements

Hardware Configuration:

Server Configuration:

Standard Pentium series processor.

Minimum 4 MB RAM.

HDD storage capacity of 360GB with 5400 rpm or more.

Client Configuration:

Any computer system with normal speed.

Internet Connectivity

Software Requirements:

Server Software:

Visual Studio .NET 2005 (Framework 2.0)

Windows 2000 or higher OS.

Client Software:

VGA or higher resolution monitor, IE 6.0+, Firefox 2.0+, Flash.

System Requirement Study:

The requirements can be classified as below.

1) Functional requirements.

2) Non-functional requirements.

Functional Requirements:

DATABASE FUNCTION

1) Inserting the product and customer information.

2) Retrieval of the stored data according to the user needs.

3) Deletion of stored information about outdated products which are out of the market and no longer needed by customers.

4) Updating the stored information according to changes in products and customer information.

SYSTEM FUNCTIONS

The main features of the system are as follows.

1) To provide the facility to register a new customer.

2) To provide the facility to customer to add product to cart, remove product from cart, and also to provide the facility to calculate total price.

3) To provide the facility to confirm orders and shipping.

4) To provide the facility for the customer to see their account, edit their account, edit credit card details, and change their password.

5) To provide the facility for the administrator to add customers, add categories, add products, edit billing info, edit credit card info, view daily orders, and edit shipping info.

6) To provide the facility for the customer to inquire about a product.

Non-Functional Requirements:

Efficiency

The system must provide easy and fast access without consuming excessive time and resources.

Reliability

The user should never be surprised by the behavior of the system. It should provide meaningful feedback when errors occur and provide a context-sensitive user help facility so that the user can recover from errors. The system should be available whenever the user demands the service.

Portability

The system must be platform independent, network independent, and hardware independent. For example, no extra hardware is required to run the system.

Constraints:

Payment through e-check/DD needs gateway services, which are of high cost. In credit card validation, all Visa, MasterCard, and American Express cards provide primary-level validation, but secondary-level validation is of high cost. The user has to agree with the policies of the retailer. The customer must have some ID like a passport, Social Security number, voting card, PAN card, etc. There is wide use of images, so a low amount of RAM makes it a little inconvenient to keep the site fast; to get results very quickly and appropriately, high-speed RAM is needed.

2. Technology Used

Introduction to .NET

Visual Studio .NET is a complete set of development tools for building ASP Web applications, XML Web services, desktop applications, and mobile applications. Visual Basic .NET, Visual C++ .NET, and Visual C# .NET all use the same integrated development environment (IDE), which allows them to share tools and facilitates the creation of mixed-language solutions. In addition, these languages leverage the functionality of the .NET Framework, which provides access to key technologies that simplify the development of ASP Web applications and XML Web services.


The .NET Framework is an integral Windows component that supports building and running the next generation of applications and XML Web services. The .NET Framework is designed to fulfill the following objectives:

To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.

To provide a code-execution environment that minimizes software deployment and versioning conflicts.

To provide a code-execution environment that promotes safe execution of code, including code created by an unknown or semi-trusted third party.

To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments.

To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and Web-based applications.

To build all communication on industry standards to ensure that code based on the .NET Framework can integrate with any other code.

The .NET Framework has two main components: the common language runtime and the .NET Framework class library. The common language runtime is the foundation of the .NET Framework. You can think of the runtime as an agent that manages code at execution time, providing core services such as memory management, thread management, and remoting, while also enforcing strict type safety and other forms of code accuracy that promote security and robustness. In fact, the concept of code management is a fundamental principle of the runtime. Code that targets the runtime is known as managed code, while code that does not target the runtime is known as unmanaged code. The class library, the other main component of the .NET Framework, is a comprehensive, object-oriented collection of reusable types that you can use to develop applications ranging from traditional command-line or graphical user interface (GUI) applications to applications based on the latest innovations provided by ASP.NET, such as Web Forms and XML Web services.

The .NET Framework can be hosted by unmanaged components that load the common language runtime into their processes and initiate the execution of managed code, thereby creating a software environment that can exploit both managed and unmanaged features. The .NET Framework not only provides several runtime hosts, but also supports the development of third-party runtime hosts.

For example, ASP.NET hosts the runtime to provide a scalable, server-side environment for managed code. ASP.NET works directly with the runtime to enable ASP.NET applications and XML Web services, both of which are discussed later in this topic.

Internet Explorer is an example of an unmanaged application that hosts the runtime (in the form of a MIME type extension). Using Internet Explorer to host the runtime enables you to embed managed components or Windows Forms controls in HTML documents. Hosting the runtime in this way makes managed mobile code (similar to Microsoft ActiveX controls) possible, but with significant improvements that only managed code can offer, such as semi-trusted execution and isolated file storage.

The following illustration shows the relationship of the common language runtime and the class library to your applications and to the overall system. The illustration also shows how managed code operates within a larger architecture.

.NET Framework in context

The following sections describe the main components and features of the .NET Framework in greater detail.

Features of the Common Language Runtime (CLR)

The common language runtime manages memory, thread execution, code execution, code safety verification, compilation, and other system services. These features are intrinsic to the managed code that runs on the common language runtime.

With regards to security, managed components are awarded varying degrees of trust, depending on a number of factors that include their origin (such as the Internet, enterprise network, or local computer). This means that a managed component might or might not be able to perform file-access operations, registry-access operations, or other sensitive functions, even if it is being used in the same active application.

The runtime enforces code access security. For example, users can trust that an executable embedded in a Web page can play an animation on screen or sing a song, but cannot access their personal data, file system, or network. The security features of the runtime thus enable legitimate Internet-deployed software to be exceptionally feature rich.

The runtime also enforces code robustness by implementing a strict type-and-code-verification infrastructure called the common type system (CTS). The CTS ensures that all managed code is self-describing. The various Microsoft and third-party language compilers generate managed code that conforms to the CTS. This means that managed code can consume other managed types and instances, while strictly enforcing type fidelity and type safety.

In addition, the managed environment of the runtime eliminates many common software issues. For example, the runtime automatically handles object layout and manages references to objects, releasing them when they are no longer being used. This automatic memory management resolves the two most common application errors, memory leaks and invalid memory references.

The runtime also accelerates developer productivity. For example, programmers can write applications in their development language of choice, yet take full advantage of the runtime, the class library, and components written in other languages by other developers. Any compiler vendor who chooses to target the runtime can do so. Language compilers that target the .NET Framework make the features of the .NET Framework available to existing code written in that language, greatly easing the migration process for existing applications.

While the runtime is designed for the software of the future, it also supports software of today and yesterday. Interoperability between managed and unmanaged code enables developers to continue to use necessary COM components and DLLs.

The runtime is designed to enhance performance. Although the common language runtime provides many standard runtime services, managed code is never interpreted. A feature called just-in-time (JIT) compiling enables all managed code to run in the native machine language of the system on which it is executing. Meanwhile, the memory manager removes the possibilities of fragmented memory and increases memory locality-of-reference to further increase performance.

Finally, the runtime can be hosted by high-performance, server-side applications, such as Microsoft SQL Server and Internet Information Services (IIS). This infrastructure enables you to use managed code to write your business logic, while still enjoying the superior performance of the industry's best enterprise servers that support runtime hosting.

Compilation Process of .Net Technology

Figure 2 Language Compilation in .NET

.NET Framework Class Library

The .NET Framework class library is a collection of reusable types that tightly integrate with the common language runtime. The class library is object oriented, providing types from which your own managed code can derive functionality. This not only makes the .NET Framework types easy to use, but also reduces the time associated with learning new features of the .NET Framework. In addition, third-party components can integrate seamlessly with classes in the .NET Framework.

For example, the .NET Framework collection classes implement a set of interfaces that you can use to develop your own collection classes. Your collection classes will blend seamlessly with the classes in the .NET Framework.

As you would expect from an object-oriented class library, the .NET Framework types enable you to accomplish a range of common programming tasks, including tasks such as string management, data collection, database connectivity, and file access. In addition to these common tasks, the class library includes types that support a variety of specialized development scenarios. For example, you can use the .NET Framework to develop the following types of applications and services:

Console applications.

Windows GUI applications (Windows Forms).

ASP.NET applications.

XML Web services.

Windows services.

For example, the Windows Forms classes are a comprehensive set of reusable types that vastly simplify Windows GUI development. If you write an ASP.NET Web Form application, you can use the Web Forms classes.

Accessing data with ADO.NET

ADO.NET provides consistent access to data sources such as Microsoft SQL Server, as well as data sources exposed through OLE DB and XML. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, manipulate, and update data.

ADO.NET cleanly factors data access from data manipulation into discrete components that can be used separately or in tandem. ADO.NET includes .NET Framework data providers for connecting to a database, executing commands, and retrieving results. Those results are either processed directly, or placed in an ADO.NET DataSet object in order to be exposed to the user in an ad-hoc manner, combined with data from multiple sources, or remoted between tiers. The ADO.NET DataSet object can also be used independently of a .NET Framework data provider to manage data local to the application or sourced from XML.

The ADO.NET classes are found in System.Data.dll, and are integrated with the XML classes found in System.Xml.dll. When compiling code that uses the System.Data namespace, reference both System.Data.dll and System.Xml.dll.

ADO.NET provides functionality to developers writing managed code similar to the functionality provided to native COM developers by ADO.

ADO.NET Components

The ADO.NET components have been designed to factor data access from data manipulation. There are two central components of ADO.NET that accomplish this: the DataSet, and the .NET Framework data provider, which is a set of components including the Connection, Command, DataReader, and DataAdapter objects.

The ADO.NET DataSet is the core component of the disconnected architecture of ADO.NET. The DataSet is explicitly designed for data access independent of any data source. As a result it can be used with multiple and differing data sources, used with XML data, or used to manage data local to the application. The DataSet contains a collection of one or more DataTable objects made up of rows and columns of data, as well as primary key, foreign key, constraint, and relation information about the data in the DataTable objects.

The other core element of the ADO.NET architecture is the .NET Framework data provider, whose components are explicitly designed for data manipulation and fast, forward-only, read-only access to data. The Connection object provides connectivity to a data source. The Command object enables access to database commands to return data, modify data, run stored procedures, and send or retrieve parameter information. The DataReader provides a high-performance stream of data from the data source. Finally, the DataAdapter provides the bridge between the DataSet object and the data source. The DataAdapter uses Command objects to execute SQL commands at the data source to both load the DataSet with data, and reconcile changes made to the data in the DataSet back to the data source.
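As a brief illustrative sketch of how these objects fit together (the connection string, the Products and Orders tables, and the column names are placeholder assumptions, not the project's actual schema), the following C# fragment reads rows with a DataReader and then fills a DataSet through a DataAdapter:

using System;
using System.Data;
using System.Data.SqlClient;

class AdoNetSketch
{
    static void Main()
    {
        // Placeholder connection string and schema; adjust for the real database.
        string connStr = "Data Source=.\\SQLEXPRESS;Initial Catalog=EFarming;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Fast, forward-only, read-only access with a DataReader.
            SqlCommand cmd = new SqlCommand("SELECT ProductId, ProductName FROM Products", conn);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }

            // Disconnected access: the DataAdapter fills a DataSet that can be
            // used (and later reconciled) independently of the open connection.
            SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Orders", conn);
            DataSet ds = new DataSet();
            adapter.Fill(ds, "Orders");
            Console.WriteLine("Orders loaded: " + ds.Tables["Orders"].Rows.Count);
        }
    }
}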

You can write .NET Framework data providers for any data source. The .NET Framework ships with two .NET Framework data providers: the .NET Framework Data Provider for SQL Server and the .NET Framework Data Provider for OLE DB.

The following diagram illustrates the components of the ADO.NET architecture.

ADO.NET architecture

Platform Invoke (Windows API)

Platform invoke relies on metadata to locate exported functions and marshal their arguments at run time. The following illustration shows this process.

A platform invoke call to an unmanaged DLL function

When platform invoke calls an unmanaged function, it performs the following sequence of actions:

1. Locates the DLL containing the function.

2. Loads the DLL into memory.

3. Locates the address of the function in memory and pushes its arguments onto the stack, marshaling data as required.
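A minimal sketch of declaring and calling an unmanaged function through platform invoke (the Win32 MessageBox import is shown only as an illustration; it is not part of the project code):

using System;
using System.Runtime.InteropServices;

class PInvokeSketch
{
    // Declare the unmanaged Win32 function; the runtime locates user32.dll,
    // loads it, and marshals the arguments when the call is made.
    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);

    static void Main()
    {
        MessageBox(IntPtr.Zero, "Hello from platform invoke", "P/Invoke", 0);
    }
}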

Why we Use C#.Net:

It supports client/server architecture. C#.Net also provides database objects through ADO.Net, which is very useful for making client/server applications. It gives more facilities, like a disconnected database structure, with classes such as DataAdapter and Connection.

C#.Net is an object-oriented language which provides the facilities of inheritance, constructors, destructors, multithreading, etc. C#.Net provides many data types, which give flexibility in programming. It also provides Crystal Reports support for making reports, which is an advantage of C#.Net. In our application, reports with graphical representation are a very important part.

The most important feature of C#.Net is the disconnected database structure. That feature is very useful in our application, and it also gives speed and accuracy to the client/server model.

Features of C#.Net:

Inheritance: C#.Net supports inheritance by allowing you to define classes that serve as the basis for derived classes. Derived classes inherit, and can extend, the properties and methods of the base class. They can also override inherited methods with new implementations. All classes created with C#.Net are inheritable by default. Because the forms you design are really classes, you can use inheritance to define new forms based on existing ones.
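A short hypothetical sketch of inheritance and overriding in C# (the Crop and Wheat classes are illustrative only):

using System;

class Crop                       // base class
{
    public virtual string Describe() { return "A generic crop"; }
}

class Wheat : Crop               // derived class inherits and extends the base
{
    public override string Describe()
    {
        // Call the original implementation and extend it.
        return base.Describe() + ", specifically wheat";
    }
}

class InheritanceDemo
{
    static void Main()
    {
        Crop crop = new Wheat();
        Console.WriteLine(crop.Describe()); // "A generic crop, specifically wheat"
    }
}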

Exception Handling: C#.Net supports structured exception handling, using an enhanced version of the try...catch...finally syntax supported by other languages such as C++. Structured exception handling combines a modern control structure with exceptions, protected blocks of code, and filters. Structured exception handling makes it easy to create and maintain programs with robust, comprehensive error handlers.
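A minimal illustrative sketch of the try/catch/finally structure:

using System;

class ExceptionDemo
{
    static void Main()
    {
        try
        {
            int[] prices = new int[2];
            prices[5] = 100;                       // throws IndexOutOfRangeException
        }
        catch (IndexOutOfRangeException ex)
        {
            Console.WriteLine("Handled: " + ex.Message);
        }
        finally
        {
            Console.WriteLine("Cleanup always runs here.");
        }
    }
}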

Overloading: Overloading is the ability to define properties, methods, or procedures that have the same name but use different data types. Overloaded procedures allow you to provide as many implementations as necessary to handle different kinds of data, while giving the appearance of a single, versatile procedure.
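For illustration, a hypothetical class with one overloaded method name:

using System;

class PriceFormatter
{
    // Same method name, different parameter types.
    public string Format(int rupees)            { return rupees + " Rs"; }
    public string Format(double rupees)         { return rupees.ToString("0.00") + " Rs"; }
    public string Format(int rupees, int paise) { return rupees + "." + paise.ToString("00") + " Rs"; }
}

class OverloadDemo
{
    static void Main()
    {
        PriceFormatter f = new PriceFormatter();
        Console.WriteLine(f.Format(10));       // "10 Rs"
        Console.WriteLine(f.Format(10.5));     // "10.50 Rs"
        Console.WriteLine(f.Format(10, 50));   // "10.50 Rs"
    }
}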

Overriding Properties and Methods: The override keyword allows derived objects to override characteristics inherited from parent objects. Overridden members have the same arguments as the members inherited from the base class, but different implementations. A member's new implementation can call the original implementation in the parent class by preceding the member name with the base keyword.

Constructors and Destructors: Constructors are procedures that control the initialization of new instances of a class. Conversely, destructors are methods that free system resources when a class leaves scope or is set to nothing. In C#.Net, a constructor is a method with the same name as its class, and a destructor (finalizer) is written as ~ClassName.
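An illustrative sketch (the FileLogger class is hypothetical):

using System;

class FileLogger
{
    private string name;

    public FileLogger(string name)      // constructor: runs when the object is created
    {
        this.name = name;
        Console.WriteLine("Logger " + name + " initialized.");
    }

    ~FileLogger()                       // destructor (finalizer): runs when the GC reclaims the object
    {
        Console.WriteLine("Logger " + name + " releasing resources.");
    }
}

class CtorDemo
{
    static void Main()
    {
        FileLogger log = new FileLogger("orders");
        // The finalizer runs later, when the garbage collector collects the object.
    }
}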

Data Types: C#.Net provides many built-in data types. For example, the char data type is an unsigned 16-bit quantity used to store Unicode characters; it is equivalent to the .NET Framework System.Char data type.

Interfaces: Interfaces describe the properties and methods of classes but, unlike classes, do not provide implementations. The interface keyword allows you to declare interfaces, while a class implements an interface by providing code for the members the interface describes.
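A small hypothetical sketch of declaring and implementing an interface:

using System;

interface IPriceSource
{
    decimal GetPrice(string productName);   // no implementation in the interface
}

class MarketPriceSource : IPriceSource
{
    public decimal GetPrice(string productName)
    {
        // Hypothetical fixed prices, for illustration only.
        return productName == "Wheat" ? 25.0m : 10.0m;
    }
}

class InterfaceDemo
{
    static void Main()
    {
        IPriceSource source = new MarketPriceSource();
        Console.WriteLine(source.GetPrice("Wheat")); // 25.0
    }
}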

Shared Members: Shared (static) members are properties, procedures, and fields that are shared by all instances of a class. Shared data members are useful when multiple objects need to use information that is common to all of them. Shared class methods can be used without first creating an object from the class.

References: References allow you to use objects defined in other assemblies. In C#.Net, references point to assemblies instead of type libraries.

Namespaces: Namespaces prevent naming conflicts by organizing classes, interfaces, and methods into hierarchies.

Assemblies: Assemblies replace and extend the capabilities of type libraries by describing all the required files for a particular component or application. An assembly can contain one or more namespaces.

Attributes: Attributes enable you to provide additional information about program elements. For example, you can use an attribute to specify which methods in a class should be exposed when the class is used as an XML Web service.

Multithreading: C#.Net allows you to write applications that can perform multiple tasks independently. A task that has the potential of holding up other tasks can execute on a separate thread, a technique known as multithreading. By causing complicated tasks to run on threads that are separate from your user interface, multithreading makes your applications more responsive to user input.
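A minimal illustrative sketch using the System.Threading.Thread class (the LoadPrices method is a stand-in for any slow background task):

using System;
using System.Threading;

class ThreadDemo
{
    static void LoadPrices()
    {
        Thread.Sleep(1000);                       // simulate slow background work
        Console.WriteLine("Prices loaded on worker thread.");
    }

    static void Main()
    {
        Thread worker = new Thread(LoadPrices);
        worker.Start();                           // long task runs off the main thread

        Console.WriteLine("Main thread stays responsive.");
        worker.Join();                            // wait before exiting
    }
}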

Bit Shift Operators: C#.Net supports arithmetic left and right shift operations on integral data types. Arithmetic shifts are not circular, which means the bits shifted off one end of the result are not reintroduced at the other end. The corresponding assignment operators are provided as well.
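For illustration:

using System;

class ShiftDemo
{
    static void Main()
    {
        int quantity = 5;                 // binary 0000 0101
        Console.WriteLine(quantity << 2); // 20 : shifted left two places
        Console.WriteLine(quantity >> 1); // 2  : shifted right one place

        quantity <<= 3;                   // compound assignment form
        Console.WriteLine(quantity);      // 40
    }
}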

MS SQL SERVER 2005 Express Edition Overview

Microsoft SQL Server 2005 Express Edition extends the performance, reliability, quality, and ease of use of Microsoft SQL Server version 7.0. Microsoft SQL Server 2005 Express Edition includes several new features that make it an excellent database platform for large-scale online transaction processing (OLTP), data warehousing, and e-commerce applications. The OLAP Services feature available in SQL Server version 7.0 is now called SQL Server 2005 Express Edition Analysis Services. The term OLAP Services has been replaced with the term Analysis Services. Analysis Services also includes a new data mining component.

Microsoft SQL Server 2005 Express Edition is a set of components that work together to meet the data storage and analysis needs of the largest Web sites and enterprise data processing systems. The topics in SQL Server Architecture describe how the various components work together to manage data effectively.

Internet Integration

The SQL Server 2005 Express Edition database engine includes integrated XML support. It also has the scalability, availability, and security features required to operate as the data storage component of the largest Web sites. The SQL Server 2005 Express Edition programming model is integrated with the Windows DNA architecture for developing Web applications, and SQL Server 2005 Express Edition supports features such as English Query and the Microsoft Search Service to incorporate user-friendly queries and powerful search capabilities in Web applications.

Scalability and Availability

The same database engine can be used across platforms ranging from laptop computers running Microsoft Windows 98 through large, multiprocessor servers running Microsoft Windows 2000 Data Center Edition. SQL Server 2005 Express Edition supports features such as federated servers, indexed views, and large memory support that allow it to scale to the performance levels required by the largest Web sites.

Enterprise-Level Database Features

The SQL Server 2005 Express Edition relational database engine supports the features required to support demanding data processing environments. The database engine protects data integrity while minimizing the overhead of managing thousands of users concurrently modifying the database.

SQL Server 2005 Express Edition includes a set of administrative and development tools that improve upon the process of installing, deploying, managing, and using SQL Server across several sites. SQL Server 2000 also supports a standards-based programming model integrated with Windows DNA, making SQL Server databases easier to use.

Data warehousing

SQL Server 2005 Express Edition includes tools for extracting and analyzing summary data for online analytical processing. SQL Server also includes tools for visually designing databases and analyzing data using English-based questions.

Database Architecture

Microsoft SQL Server 2005 Express Edition data is stored in databases. The data in a database is organized into the logical components visible to users. A database is also physically implemented as two or more files on disk.

By using a database, it is possible to work primarily with the logical components such as tables, views, procedures, and users. The physical implementation of files is largely transparent. Typically, only the database administrator needs to work with the physical implementation.

Each instance of SQL Server has four system databases (master, model, tempdb, and msdb) and one or more user databases. Some organizations have only one user database, containing all the data for their organization. Some organizations have different databases for each group in their organization, and sometimes a database used by a single application.

It is not necessary to run multiple copies of the SQL Server database engine to allow multiple users to access the databases on a server. An instance of the SQL Server Standard or Enterprise Edition is capable of handling thousands of users working in multiple databases at the same time.

When connecting to an instance of SQL Server, the connection is associated with a particular database on the server. This database is called the current database. The user is usually connected to a database defined as the default database by the system administrator, although connection options in the database APIs can be used to specify another database.

SQL Server 2005 Express Edition allows detaching databases from an instance of SQL Server, then reattaching them to another instance, or even attaching the database back to the same instance. If there is a SQL Server database file, it is possible to attach that database file with a specific database name.

Relational Database components:

The database component of Microsoft SQL Server 2005 Express Edition is a Structured Query Language (SQL)-based, scalable, relational database with integrated Extensible Markup Language (XML) support for Internet applications. Each of the following terms describes a fundamental part of the architecture of the SQL Server 2005 Express Edition database component:

Database

A database is similar to a data file in that it is a storage place for data. Like a data file, a database does not present information directly to a user; the user runs an application that accesses data from the database and presents it to the user in an understandable format. Database systems are more powerful than data files in that data is more highly organized.

In a well-designed database, there are no duplicate pieces of data that the user or application must update at the same time. Related pieces of data are grouped together in a single structure or record, and relationships can be defined between these structures and records.

Relational Database

Although there are different ways to organize data in a database, relational databases are one of the most effective. Relational database systems are an application of mathematical set theory to the problem of effectively organizing data. In a relational database, data is collected into tables (called relations in relational theory).

Scalable

SQL Server 2005 Express Edition supports having a wide range of users access it at the same time. An instance of SQL Server 2005 Express Edition includes the files that make up a set of databases and a copy of the DBMS software. Applications running on separate computers use a SQL Server 2005 Express Edition communications component to transmit commands over a network to the SQL Server 2005 Express Edition instance. When an application connects to an instance of SQL Server 2005 Express Edition, it can reference any of the databases in that instance that the user is authorized to access. The communication component also allows communication between an instance of SQL Server 2005 Express Edition and an application running on the same computer.

Structured Query Language

To work with data in a database, the user has to use a set of commands and statements (a language) defined by the DBMS software. Several different languages can be used with relational databases; the most common is SQL. The American National Standards Institute (ANSI) and the International Organization for Standardization (ISO) define software standards, including the SQL standard.

Extensible Markup Language (XML)

XML is the emerging Internet standard for data. XML is a set of tags that can be used to define the structure of a hypertext document. XML documents can be easily transformed into Hypertext Markup Language, which is the most important language for displaying Web pages.

Database Design Considerations

Designing a database requires an understanding of both the business functions you want to model and the database concepts and features used to represent those business functions.

It is important to accurately design a database to model the business because it can be time consuming to change the design of a database significantly once implemented. A well-designed database also performs better.

Database Architecture

Microsoft SQL Server 2005 Express Edition data is stored in databases. The data in a database is organized into the logical components visible to users. A database is also physically implemented as two or more files on disk.

When using a database, you work primarily with the logical components such as tables, views, procedures, and users. The physical implementation of files is largely transparent. Typically, only the database administrator needs to work with the physical implementation.

Each instance of SQL Server has four system databases (master, model, tempdb, and msdb) and one or more user databases. Some organizations have only one user database, containing all the data for their organization. Some organizations have different databases for each group in their organization, and sometimes a database used by a single application. For example, an organization could have one database for sales, one for payroll, one for a document management application, and so on. Sometimes an application uses only one database; other applications may access several databases.

It is not necessary to run multiple copies of the SQL Server database engine to allow multiple users to access the databases on a server. An instance of the SQL Server Standard or Enterprise Edition is capable of handling thousands of users working in multiple databases at the same time. Each instance of SQL Server makes all databases in the instance available to all users that connect to the instance, subject to the defined security permissions.

When connecting to an instance of SQL Server, your connection is associated with a particular database on the server. This database is called the current database. You are usually connected to a database defined as your default database by the system administrator, although you can use connection options in the database APIs to specify another database. You can switch from one database to another using either the Transact-SQL USE database_name statement, or an API function that changes your current database context.

SQL Server 2000 allows you to detach databases from an instance of SQL Server, then reattach them to another instance, or even attach the database back to the same instance. If you have a SQL Server database file, you can tell SQL Server when you connect to attach that database file with a specific database name.

MS Visio 2003

Microsoft Visio is a graphical tool used for flow charts, DFDs, and ER diagrams. It contains all the shapes for flow charts, DFDs, ER diagrams, etc., so that a user can drag and drop them onto the drawing page to create a drawing without having to draw anything manually. Shapes can have online Help to assist a user in using them correctly.

The following visual solutions are included:

Block Diagram Includes the Basic, Block, and Block with Perspective templates. These are useful for showing all types of relationships and hierarchies and provide the basic arsenal of information graphics tools.

Brainstorming Includes the new Brainstorming diagram that allows you to capture, arrange, and expand ideas generated by a group or yourself. These diagrams display hierarchical relationships and allow exportation to Word for a more linear view, or to an Extensible Markup Language (XML) file for reuse elsewhere.

Business Process The new Business Process category provides a collection of templates you can use for specific business process documentation efforts, including Six Sigma, SAP, and International Organization for Standardization (ISO).

Building Plan Provides a quick way to design accurate, to-scale office and furniture layouts.

Charts and Graph Formerly Forms and Charts. Includes templates for designing business forms; creating quick pie, line, and bar charts and graphs; and creating marketing diagrams.

Flowchart Includes templates for creating audit diagrams, basic flowcharts, cause and effect diagrams, cross-functional flowcharts, mind mapping diagrams, total quality management (TQM) charts, and workflow diagrams.

Map Includes templates for creating simple street maps and attractive 3-D maps.

Network Includes shapes designed to resemble common network topology and devices. Useful for planning and documenting small to medium-sized networks.

Organization Chart Includes intelligent shapes that know their position in an organization, so that reporting structures stay in place. You can even use the Organization Chart Wizard to automatically build a chart from a spreadsheet or database without having to draw a thing.

Project Schedule Includes templates for creating PERT charts, Gantt charts, timelines, and calendars, so you can keep your projects on track.

Building Plan Includes templates for creating plan-view drawings of corporate offices and industrial manufacturing facilities. Designed for space planners and building engineers, this solution lets you create floor plans, home plans, plant layouts, reflected ceiling plans, site plans, and the building services schematics that support them.

Database Includes templates for communicating database designs using multiple notations intended for database professionals. With the Database Model Diagram template, you can even reverse engineer and get support for leading client/server and desktop databases.

Electrical Engineering Includes a variety of templates used by electrical engineers for creating electrical and electronic schematics, wiring diagrams, and logic diagrams.

Mechanical Engineering Includes templates for diagramming fluid power control systems and hydraulic or pneumatic circuits as well as part and assembly drawings.

Network Includes further templates for creating high-level, logical diagrams and for designing local area networks (LANs), wide area networks, wiring closets, server rooms, and telecommunications structures. In addition, you can create diagrams of Microsoft Active Directory, Novell Directory Services (NDS), and other LDAP-based directory structures.

Process Engineering Includes templates for assembling detailed piping and instrumentation diagrams (P&IDs) and process flow diagrams (PFDs) used by many chemical and industrial engineers.

Software Includes templates for major object-oriented software notations, including the full Unified Modeling Language (UML) 1.2 notation. In addition, you can diagram data flows, Windows user interfaces, COM and OLE objects, and more.

Web Diagram Includes templates for automatically mapping Web sites and conceptual shapes for planning new designs.

DATA FLOW DIAGRAMS


[Layered architecture diagram: Presentation Layer (.aspx) -> Business Logic Layer (.aspx.cs) -> Data Access Layer -> Database (SQL Server 2005)]

[Event trace labels for user login: 1. Click on login; 2. Enter user ID; 3. Enter password; 4. Validate check; 5. Verify UID/password; 6. Control goes to respective pages; 7. Wrong password; 8. Forgot password; 9. Control goes to respective pages; 10. Login record update]

[Event trace labels for recruiters: 1. Click on Recruiters Login]

[DFD labels — Admin: Login, Add News, Add Weather information, Product details, Store Data. User: Register, Login, View News, Gather information about E-Farming, Store Data, Retrieve Data. User: Purchase, Order details, Payment details, Store Data.]

[.NET platform diagram labels: Web Form, Web Service, Windows, .NET Framework, .NET Foundation Web Services, The Internal Web Service, Third-Party Web Services, .NET Enterprise Servers, Clients, Applications]

[Figure 2 labels: source code in VB 2005, C#, or another .NET language -> appropriate compiler (vbc.exe, csc.exe) -> DLL or EXE file in IL (intermediate language code) -> JIT (just-in-time) compiler -> native machine code -> execute]

[Schedule representation table columns: Sr. No, Phases, Start, Finish, Duration]