Bioinformatics lab data analysis system

1. INTRODUCTION

1.1 ABSTRACT

This project, entitled "Bioinformatics lab data analysis system", is developed using ASP.NET as the front end, C# as the coding language and SQL Server as the back end. Data details are furnished using data grids and MS Charts, and data validation is done through JavaScript.

The main objective of this project is to develop a client-server environment for transferring medical report details from the biomedical lab, and to improve data analysis by comparing current results against previous data.

Accurate recording of all user report transactions leads to better data management and richer comparisons. The project supports two types of users: admin and patient. The admin is provided with a secured login, and patient accounts are registered by the admin, who issues each patient a username and password. The initial step is to enter the medical test requirement of the patient; the details cover the prescribing doctor, the patient and the lab test. Each disease has its own CTA (complete test analysis) for its test result. For example, a general blood sugar test covers HDL cholesterol, LDL cholesterol, glucose level and so on; together these are referred to as the CTA tests.

Once a patient has been registered successfully, their blood is collected and a temporary reference number is issued for admin and employee reference. After the blood test is completed, the admin uploads the test result against the corresponding username, and the uploaded details are stored on a centralized server for later use.

Patients can log in with their username and password and, by selecting a date, view their current test result from home. A test history option is also provided, so patients can compare a result with their previous results; this helps them track their health condition and treatment. An interactive grid and chart are generated to present the results. The admin can maintain patient details, patient counts, test results and so on, which makes communication between admin and patients more comfortable. The entire treatment record is also computerized for future reference.

1.2 MODULES

  1. Admin login and patient creation
  2. Upload Test Reports
  3. User View
  4. Upload contents
  5. Chart and grid

1.3 MODULE DESCRIPTIONS

Admin Login

The initial module of this project is the admin module. The admin is provided with a username and password, and a change-password option is available inside the admin login. The admin is the fully authorized person for the entire project and holds full operational authority: the admin can access every option in the project and perform all types of updates. The admin can also create patients and issue their usernames and passwords.
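As an illustration, a minimal sketch of how the admin credential check against adlogintbl (see section 4.3) could be written with a parameterized query; the "connection" connection-string key follows the sample code in section 12, while the method name and control values are assumptions:

using System.Data.SqlClient;
using System.Web.Configuration;

// Hypothetical helper: returns true when the supplied credentials match a row in adlogintbl.
private bool IsValidAdmin(string username, string password)
{
    string connstring = WebConfigurationManager.ConnectionStrings["connection"].ConnectionString;
    using (SqlConnection con = new SqlConnection(connstring))
    using (SqlCommand cmd = new SqlCommand(
        "select count(*) from adlogintbl where username = @u and password = @p", con))
    {
        // Parameters avoid building the SQL string by hand.
        cmd.Parameters.AddWithValue("@u", username);
        cmd.Parameters.AddWithValue("@p", password);
        con.Open();
        return (int)cmd.ExecuteScalar() > 0;
    }
}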

Upload Test Reports

This module is under the control of the admin. A temporary reference number is generated for each patient. After completion of the lab test, the report details are uploaded into the patient's zone, and the uploaded data is centralized on the server. Uploads can be categorized date-wise, doctor-wise, patient-wise, disease-wise and so on, and every test carries its sub-test details. Full payment is collected from the patient when the blood sample is given to the lab advisor.
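One possible way to build the temporary reference number is sketched below; the prefix and format are illustrative assumptions, not taken from the project code:

using System;

// Hypothetical sketch: combine the collection date with a random suffix, e.g. "REF-20240115-4821".
private string GenerateReferenceNumber()
{
    Random rnd = new Random();
    return "REF-" + DateTime.Now.ToString("yyyyMMdd") + "-" + rnd.Next(1000, 9999).ToString();
}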

User View

Using their username and password, the patient can log in from anywhere at any time. All uploaded content can be viewed inside the user's login, and a change-password option is provided so the patient can keep their treatment history secure.

Upload Content

A premium option is provided to the patient in this module. If the patient has taken treatment elsewhere, they can upload the new lab report alongside their existing treatment report; a data-merging step combines the existing data with the current data. Beyond other labs' reports, patients can also upload various medical records for future use.

Chart and Grid

This is the data display module: various grids and charts can be generated on the user side, which makes the data clearer to read. The patient can select a from-date and a to-date to view their treatment history, and the data is compared in charts and graphs for clarity. Using this option, patients can monitor their health and gain more confidence in their treatment.
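A minimal sketch of binding a patient's test history to an ASP.NET Chart control is given below; the chart control, its ID and the column names (taken from updateresulttbl in section 4.3) are assumptions about how the display could be wired up:

using System.Data;
using System.Web.UI.DataVisualization.Charting;

// Hypothetical: dt holds the patient's history rows with "uptime" and "labtestvalue" columns.
private void BindHistoryChart(Chart chartHistory, DataTable dt)
{
    if (chartHistory.ChartAreas.Count == 0)
    {
        chartHistory.ChartAreas.Add(new ChartArea());   // the chart needs at least one area to render
    }
    Series series = new Series("Test value");
    series.ChartType = SeriesChartType.Line;
    foreach (DataRow row in dt.Rows)
    {
        series.Points.AddXY(row["uptime"], row["labtestvalue"]);
    }
    chartHistory.Series.Clear();
    chartHistory.Series.Add(series);
}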

2. SYSTEM SPECIFICATION

2.1 HARDWARE SPECIFICATION

PROCESSOR : Intel Pentium Dual Core 1.8 GHz

MOTHERBOARD : Intel 915GVSR chipset board

RAM : 4 GB DDR3 RAM

HARD DISK DRIVE : 750 GB

DVD/CD DRIVE : Sony 52 x Dual layer drive

MONITOR : 17” Color TFT Monitor

KEYBOARD : Multimedia Keyboard 108 Keys

MOUSE : Logitech Optical Mouse

CABINET : ATX iball.

2.2 SOFTWARE CONFIGURATION

FRONTEND : ASP.NET 2012

CODING LANGUAGE : C#

BACK END : SQL SERVER 2010

CLIENT SERVER TOOL : AJAX 2.0

OPERATING SYSTEMS : Microsoft Windows 7

DOCUMENTATION : Microsoft word 2007.

SCRIPTING LANGUAGE : JavaScript

3. SYSTEM STUDY

3.1 EXISTING SYSTEM

In the existing system, both the bio lab admin and the patients face problems due to manual work. Once a blood sample is received from a patient, the patient must either wait for the result or come back again to collect the report, which means inconvenient travel and waiting. Sometimes a patient misplaces a report and has to visit the lab again to obtain another copy, and sometimes previous reports needed to track the medical history are lost. Verifying a patient's history from paper reports also means comparing the reports manually. The admin likewise faces problems such as updating test details and managing customer details, and a large amount of paperwork remains.

Some bio labs are partly computerized, maintaining customer details and providing reports as computer printouts, but they still type the results into a word-processing file and hand printouts to the patients. They do not maintain proper patient records linked to previous medical reports, which is the most important problem the bio labs face today.

3.1.1 DISADVANTAGES OF THE EXISTING SYSTEM

  • Only manual work
  • No computerized patient reports
  • No dedicated login for patients to receive their medical reports
  • No comparison of new records with previous records
  • Patients may lose their reports
  • The admin faces problems in adding new diseases to the database
  • Problems in maintaining customer records

3.2 PROPOSED SYSTEM

All the drawbacks of the existing system have been overcome in the proposed system. Its key contribution is turning the manual work into an automated process. In the proposed system the patient need not visit the lab frequently: once the blood sample has been collected, the lab starts its process and the lab admin enters the report details directly against the corresponding patient username, so the data is transferred straight into the patient's login. Every patient is provided with a username and password, and patients can log in through the link and view their test report immediately.

On the admin side, no paperwork or manual printouts are needed; all details are computerized in this application, and the lab can provide the patient with both a soft copy and a hard copy. Whatever data has been provided is stored in the system, so even after several years a particular patient's details can be retrieved.

3.2.1 ADVANTAGES OF THE PROPOSED SYSTEM

  • Fully automated work
  • All patient details and report details are computerized
  • Individual logins with username and password are provided to patients
  • Reports can be compared by the patient inside their own login, making them more aware of their treatment as they compare records
  • Records cannot be lost, since everything is computerized
  • The admin can add or remove disease details from the admin login
  • Any customer can be tracked easily and their details maintained easily

4. SYSTEM DESIGN

4.1 DATA FLOW DIAGRAM

Level 0: The admin or customer sends a request with username and password to the bio medical lab server, which checks the database and returns a response.

Level 1: The admin logs in with username and password (adlogintbl), creates test details with test name and range (newtesttbl) and can edit or delete them, creates customers with name, age, username and password (customertbl), uploads test results with user id, name and lab test (updateresulttbl), views test results by name, and changes passwords (old password, new password).

Level 2: The customer logs in with username and password (customertbl), views test results by id, date, test name and result (updateresulttbl), compares results by id and name, and changes the password (old password, new password).

4.2 ER DIAGRAM

4.3 TABLE DESIGN

Table Name : adlogintbl

Primary key : username

Field name | Data type | Size | Constraints | Description
username | varchar | 50 | Primary key | Admin username
password | varchar | 20 | Not Null | Admin password

Table Name : customertbl

Primary key : username

Field name | Data type | Size | Constraints | Description
name | varchar | 50 | Not Null | Customer name
gender | varchar | 20 | Not Null | Customer gender
age | int | 10 | Not Null | Customer age
bloodgroup | varchar | 50 | Not Null | Customer blood group
street1 | varchar | 100 | Not Null | Customer street 1
street2 | varchar | 100 | Not Null | Customer street 2
city | varchar | 50 | Not Null | Customer city
state | varchar | 50 | Not Null | Customer state
pin | int | 10 | Not Null | Customer PIN code
phone | int | 10 | Not Null | Customer phone number
email | varchar | 50 | Not Null | Customer email
username | varchar | 50 | Primary key | Customer username
password | varchar | 20 | Not Null | Customer password

Table Name : newtesttbl

Primary key : testname

Field name | Data type | Size | Constraints | Description
testname | varchar | 50 | Primary key | Test name
subtestname | varchar | 50 | Not Null | Sub-test name
frmrange | int | 10 | Not Null | From range
fmeasurement | varchar | 50 | Not Null | From measurement
torange | int | 10 | Not Null | To range
tmeasurement | varchar | 50 | Not Null | To measurement

Table Name : updateresulttbl

Primary key : id

Foreign key : username


Field name | Data type | Size | Constraints | Description
id | int | 10 | Primary key | Identification number
username | varchar | 50 | Foreign key | Customer username
name | varchar | 50 | Not Null | Customer name
testname | varchar | 50 | Not Null | Test name
subtestname | varchar | 50 | Not Null | Sub-test name
labtest | varchar | 50 | Not Null | Lab test
labtestvalue | int | 10 | Not Null | Lab test value
frmrange | int | 10 | Not Null | From range
fmeasurement | varchar | 50 | Not Null | From measurement
torange | int | 10 | Not Null | To range
tmeasurement | varchar | 50 | Not Null | To measurement
result | varchar | 50 | Not Null | Test result
update | datetime | - | Not Null | Update date
uptime | datetime | - | Not Null | Update time
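For illustration, a hedged sketch of writing one result row into updateresulttbl with a parameterized command (the sample code in section 12 concatenates the values into the SQL string instead). Only a subset of the columns is shown, the remaining not-null columns would be supplied the same way, the id column is assumed to be an identity column, and [update] is bracketed because UPDATE is a reserved word:

using System;
using System.Data.SqlClient;

// Hypothetical sketch: insert one sub-test result for a patient (connection assumed open).
private void SaveResult(SqlConnection con, string username, string testname,
                        string subtestname, int labtestvalue, string result)
{
    string sql = "insert into updateresulttbl (username, testname, subtestname, labtestvalue, result, [update], uptime) " +
                 "values (@username, @testname, @subtestname, @labtestvalue, @result, @update, @uptime)";
    using (SqlCommand cmd = new SqlCommand(sql, con))
    {
        cmd.Parameters.AddWithValue("@username", username);
        cmd.Parameters.AddWithValue("@testname", testname);
        cmd.Parameters.AddWithValue("@subtestname", subtestname);
        cmd.Parameters.AddWithValue("@labtestvalue", labtestvalue);
        cmd.Parameters.AddWithValue("@result", result);
        cmd.Parameters.AddWithValue("@update", DateTime.Now.Date);
        cmd.Parameters.AddWithValue("@uptime", DateTime.Now);
        cmd.ExecuteNonQuery();
    }
}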

5. ABOUT SOFTWARE

5.1 ABOUT FRONT END

ASP.NET is a server-side web application framework designed for web development to produce dynamic web pages. It was developed by Microsoft to allow programmers to build dynamic web sites, web applications and web services. It was first released in January 2002 with version 1.0 of the .NET Framework, and is the successor to Microsoft's Active Server Pages (ASP) technology. ASP.NET is built on the Common Language Runtime (CLR), allowing programmers to write ASP.NET code using any supported .NET language. The ASP.NET SOAP extension framework allows ASP.NET components to process SOAP messages.

After four years of development, and a series of beta releases in 2000 and 2001, ASP.NET 1.0 was released on January 5, 2002 as part of version 1.0 of the .NET Framework. Even prior to the release, dozens of books had been written about ASP.NET, and Microsoft promoted it heavily as part of its platform for web services. Scott Guthrie became the product unit manager for ASP.NET, and development continued apace, with version 1.1 being released on April 24, 2003 as a part of Windows Server 2003. This release focused on improving ASP.NET's support for mobile devices.

Characteristics

ASP.NET web pages, known officially as Web Forms, are the main building blocks for application development. Web Forms are contained in files with an ".aspx" extension; these files typically contain static (X)HTML markup, as well as markup defining server-side web controls and user controls where the developers place the required content for the web page. Additionally, dynamic code that runs on the server can be placed in a page within a <% %> block, which is similar to other web development technologies such as PHP, JSP and ASP. With ASP.NET Framework 2.0, Microsoft introduced a new code-behind model that allows static text to remain on the .aspx page, while dynamic code goes into an .aspx.vb, .aspx.cs or .aspx.fs file (depending on the programming language used).

Directives

A directive is a special instruction on how ASP.NET should process the page. The most common directive is <%@ Page %>, which can specify many attributes used by the ASP.NET page parser and compiler.

Examples

Inline code

<%@ Page Language="C#" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<script runat="server">
    protected void Page_Load(object sender, EventArgs e)
    {
        // Assign the current time to the Label control.
        lbl1.Text = DateTime.Now.ToLongTimeString();
    }
</script>
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title>Sample page</title>
</head>
<body>
    <form id="form1" runat="server">
        <asp:Label id="lbl1" runat="server" />
    </form>
</body>
</html>

Code-behind solutions

<%@ Page Language="C#" CodeFile="SampleCodeBehind.aspx.cs" Inherits="Website.SampleCodeBehind" AutoEventWireup="true" %>

The above tag is placed at the beginning of the .aspx file. The CodeFile property of the @ Page directive specifies the file (.cs, .vb or .fs) acting as the code-behind, while the Inherits property specifies the class from which the page is derived. In this example, the @ Page directive is included in SampleCodeBehind.aspx, so SampleCodeBehind.aspx.cs acts as the code-behind for this page:

Source language c#:

Using system;

Namespace website

{

public partial class samplecodebehind : system.web.ui.page

{

protected void page_load(object sender, eventargs e)

{

response.write(“hello, world”);

}

}

}

Source language Visual Basic .NET:

Imports System

Namespace Website
    Public Partial Class SampleCodeBehind
        Inherits System.Web.UI.Page

        Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
            ' Write a greeting into the response stream.
            Response.Write("Hello, world")
        End Sub
    End Class
End Namespace

In this case, the Page_Load() method is called every time the .aspx page is requested. The programmer can implement event handlers at several stages of the page execution process to perform processing.
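For illustration, a small sketch of a code-behind class that handles more than one stage of the page life cycle; the class name is an assumption and the handlers rely on AutoEventWireup:

using System;

public partial class LifecycleDemo : System.Web.UI.Page
{
    protected void Page_Init(object sender, EventArgs e)
    {
        // Runs early: the controls exist, but view state has not been loaded yet.
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Runs on every request; IsPostBack separates the first load from postbacks.
        if (!IsPostBack)
        {
            // one-time initialisation, for example binding a grid
        }
    }

    protected void Page_PreRender(object sender, EventArgs e)
    {
        // Last chance to change the control tree before it is rendered to HTML.
    }
}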

User controls

User controls are encapsulations of sections of pages which are registered and used as controls in ASP.NET.

Custom controls

Programmers can also build custom controls for ASP.NET applications. Unlike user controls, these controls do not have an .ascx markup file; all their code is compiled into a dynamic link library (DLL) file. Such custom controls can be used across multiple web applications and Visual Studio projects.
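A minimal custom-control sketch is given below; the control and property names are illustrative only:

using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;

// Hypothetical custom control compiled into the application DLL (no .ascx file).
public class StatusBadge : WebControl
{
    public string Status { get; set; }

    protected override void RenderContents(HtmlTextWriter writer)
    {
        // Render the status text, HTML-encoded, inside the control's tag.
        writer.Write(HttpUtility.HtmlEncode(Status ?? string.Empty));
    }
}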

Rendering technique

ASP.NET uses a visited composites rendering technique. During compilation, the template (.aspx) file is compiled into initialization code which builds a control tree (the composite) representing the original template. Literal text goes into instances of the Literal control class, and server controls are represented by instances of a specific control class. The initialization code is combined with user-written code (usually by the assembly of multiple partial classes) and results in a class specific to the page. The page doubles as the root of the control tree.

Actual requests for the page are processed through a number of steps. First, during the initialization steps, an instance of the page class is created and the initialization code is executed. This produces the initial control tree, which is then typically manipulated by the methods of the page in the following steps. As each node in the tree is a control represented as an instance of a class, the code may change the tree structure as well as manipulate the properties and methods of the individual nodes. Finally, during the rendering step a visitor is used to visit every node in the tree, asking each node to render itself using the methods of the visitor. The resulting HTML output is sent to the client.

After the request has been processed, the instance of the page class is discarded and with it the entire control tree. This is a source of confusion among novice ASP.NET programmers who rely on class instance members that are lost with every page request/response cycle.

State management

ASP.NET applications are hosted by a web server and are accessed using the stateless HTTP protocol. As such, if an application uses stateful interaction, it has to implement state management on its own. ASP.NET provides various functions for state management. Conceptually, Microsoft treats "state" as GUI state. Problems may arise if an application needs to keep track of "data state"; for example, a finite-state machine which may be in a transient state between requests (lazy evaluation) or which takes a long time to initialize. State management in ASP.NET pages with authentication can make web scraping difficult or impossible.

Application

Application state is held by a collection of shared user-defined variables. These are set and initialized when the Application_OnStart event fires on the loading of the first instance of the application and are available until the last instance exits. Application state variables are accessed using the Applications collection, which provides a wrapper for the application state. Application state variables are identified by name.
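As a hedged sketch, application state could be used in the Global.asax code-behind and a page as follows; the variable name is an assumption:

using System;
using System.Web;

// Hypothetical Global.asax code-behind: create an application-wide counter once.
public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        Application["VisitCount"] = 0;
    }
}

Inside a page, the same collection is read and updated under a lock:

protected void Page_Load(object sender, EventArgs e)
{
    Application.Lock();                                          // serialize concurrent updates
    Application["VisitCount"] = (int)Application["VisitCount"] + 1;
    Application.UnLock();
}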

Session state

Server-side session state is held by a collection of user-defined session variables that are persistent during a user session. These variables, accessed using the Session collection, are unique to each session instance. The variables can be set to be automatically destroyed after a defined time of inactivity even if the session does not end. Client-side user session is maintained by either a cookie or by encoding the session ID in the URL itself.
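For example, a sketch of storing and reading a session variable inside a System.Web.UI.Page code-behind class; the control names and the redirect target are assumptions:

protected void btnLogin_Click(object sender, EventArgs e)
{
    // Written once at login; kept for the rest of the session.
    Session["username"] = txtUsername.Text;
}

protected void Page_Load(object sender, EventArgs e)
{
    string user = Session["username"] as string;
    if (user == null)
    {
        // No session value yet: send the visitor back to the login page.
        Response.Redirect("login.aspx");
    }
}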

ASP.NET supports three modes of persistence for server-side session variables:

In-process mode

The session variables are maintained within the ASP.NET process. This is the fastest way; however, in this mode the variables are destroyed when the ASP.NET process is recycled or shut down.

ASP State mode

ASP.NET runs a separate Windows service that maintains the state variables. Because state management happens outside the ASP.NET process, and because the ASP.NET engine accesses data using .NET Remoting, ASP State is slower than in-process mode. This mode allows an ASP.NET application to be load-balanced and scaled across multiple servers. Because the state management service runs independently of ASP.NET, the session variables can persist across ASP.NET process shutdowns. However, since the session state server runs as a single instance, it is still a single point of failure for session state. The session-state service cannot be load-balanced, and there are restrictions on the types that can be stored in a session variable.

5.2 ABOUT BACK END

SQL Server is Microsoft's relational database management system (RDBMS). It is a full-featured database primarily designed to compete with Oracle Database and MySQL.

Like all major RDBMS products, SQL Server supports ANSI SQL, the standard SQL language. However, SQL Server also contains T-SQL, its own SQL implementation. SQL Server Management Studio (SSMS), previously known as Enterprise Manager, is SQL Server's main interface tool, and it supports 32-bit and 64-bit environments.


SQL Server is sometimes referred to as MSSQL and Microsoft SQL Server.

Originally released in 1989 as version 1.0 by Microsoft, in conjunction with Sybase, SQL Server and its early versions were very similar to Sybase. However, the Microsoft-Sybase partnership dissolved in the early 1990s, and Microsoft retained the rights to the SQL Server trade name. Since then, Microsoft has released 2000, 2005 and 2008 versions, which feature more advanced options and better security. 

Examples of some features include: XML data type support, dynamic management views (DMVs), full-text search capability and database mirroring. SQL Server is offered in several editions with different feature sets and pricing options to meet a variety of user needs, including the following:

Enterprise: Designed for large enterprises with complex data requirements, data warehousing and Web-enabled databases. Has all the features of SQL Server, and its license pricing is the most expensive. 

Standard: Targeted toward small and medium organizations. Also supports e-commerce and data warehousing.

Workgroup: For small organizations. No size or user limits and may be used as the backend database for small Web servers or branch offices.

Express: Free for distribution. Has the fewest number of features and limits database size and users. May be used as a replacement for an Access database.

Mainstream editions

Datacenter

SQL Server 2008 R2 Datacenter is the full-featured edition of SQL Server and is designed for datacenters that need high levels of application support and scalability. It supports 256 logical processors and virtually unlimited memory, and comes with StreamInsight Premium edition. The Datacenter edition has been retired in SQL Server 2012; all its features are available in SQL Server 2012 Enterprise Edition.

Enterprise

SQL Server Enterprise Edition includes both the core database engine and add-on services, with a range of tools for creating and managing a SQL Server cluster. It can manage databases as large as 524 petabytes, address 2 terabytes of memory and supports 8 physical processors. SQL Server 2012 Enterprise Edition supports 160 physical processors.

Standard

SQL Server Standard edition includes the core database engine, along with the stand-alone services. It differs from Enterprise edition in that it supports fewer active instances (number of nodes in a cluster) and does not include some high-availability functions such as hot-add memory (allowing memory to be added while the server is still running), and parallel indexes.

Web

SQL Server Web Edition is a low-TCO option for Web hosting.

Business Intelligence

Introduced in SQL Server 2012, this edition focuses on self-service and corporate business intelligence. It includes the Standard Edition capabilities and Business Intelligence tools: PowerPivot, Power View, the BI Semantic Model, Master Data Services, Data Quality Services and xVelocity in-memory analytics.

Workgroup

SQL Server Workgroup Edition includes the core database functionality but does not include the additional services. Note that this edition has been retired in SQL Server 2012.

Express

SQL Server Express Edition is a scaled down, free edition of SQL Server, which includes the core database engine. While there are no limitations on the number of databases or users supported, it is limited to using one processor, 1 GB memory and 4 GB database files (10 GB database files from SQL Server Express 2008 R2). It is intended as a replacement for MSDE. Two additional editions provide a superset of features not in the original Express Edition. The first is SQL Server Express with Tools, which includes SQL Server Management Studio Basic. SQL Server Express with Advanced Services adds full-text search capability and reporting services.

Specialized editions

Azure

Microsoft SQL Azure Database is the cloud-based version of Microsoft SQL Server, presented as software as a service on Azure Services Platform.

Compact (SQL CE)

The compact edition is an embedded database engine. Unlike the other editions of SQL Server, the SQL CE engine is based on SQL Mobile (initially designed for use with hand-held devices) and does not share the same binaries. Due to its small size (1 MB DLL footprint), it has a markedly reduced feature set compared to the other editions. For example, it supports a subset of the standard data types and does not support stored procedures, views or multiple-statement batches (among other limitations). It is limited to a 4 GB maximum database size and cannot be run as a Windows service; Compact Edition must be hosted by the application using it. The 3.5 version includes support for ADO.NET Synchronization Services. SQL CE does not support ODBC connectivity, unlike SQL Server proper.

Developer

SQL Server Developer Edition includes the same features as SQL Server 2012 Enterprise Edition, but is limited by the license to be used only as a development and test system, and not as a production server. This edition is available for students to download free of charge as a part of Microsoft's DreamSpark program.

Evaluation

SQL Server Evaluation Edition, also known as the Trial Edition, has all the features of the Enterprise Edition, but is limited to 180 days, after which the tools will continue to run, but the server services will stop.

Fast Track

SQL Server Fast Track is specifically for enterprise-scale data warehousing storage and business intelligence processing, and runs on reference-architecture hardware that is optimized for Fast Track.

LocalDB

Introduced in SQL Server Express 2012, LocalDB is a minimal, on-demand, version of SQL Server that is designed for application developers. It can also be used as an embedded database.

Parallel Data Warehouse (PDW)

Data warehouse Appliance Edition

Pre-installed and configured as part of an appliance in partnership with Dell and HP, based on the Fast Track architecture. This edition does not include SQL Server Integration Services, Analysis Services, or Reporting Services.

Architecture

The protocol layer implements the external interface to SQL Server. All operations that can be invoked on SQL Server are communicated to it via a Microsoft-defined format, called Tabular Data Stream (TDS). TDS is an application layer protocol, used to transfer data between a database server and a client. Initially designed and developed by Sybase Inc. for their Sybase SQL Server relational database engine in 1984, and later by Microsoft in Microsoft SQL Server, TDS packets can be encased in other physical transport dependent protocols, including TCP/IP, Named pipes, and Shared memory. Consequently, access to SQL Server is available over these protocols. In addition, the SQL Server API is also exposed over web services.

Data storage

Data storage is a database, which is a collection of tables with typed columns. SQL Server supports different data types, including primary types such as Integer, Float, Decimal, Char (including character strings), Varchar (variable length character strings), binary (for unstructured blobs of data), Text (for textual data) among others. The rounding of floats to integers uses either Symmetric Arithmetic Rounding or Symmetric Round Down (Fix) depending on arguments: SELECT Round(2.5, 0) gives 3.
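For illustration, a hedged sketch of reading typed columns from updateresulttbl with a SqlDataReader; the connection is assumed to be open, and the column names follow the table design in section 4.3:

using System;
using System.Data.SqlClient;

// Hypothetical: read one patient's results with their SQL Server column types.
private void ReadResults(SqlConnection con, string username)
{
    using (SqlCommand cmd = new SqlCommand(
        "select testname, labtestvalue, uptime from updateresulttbl where username = @u", con))
    {
        cmd.Parameters.AddWithValue("@u", username);
        using (SqlDataReader rd = cmd.ExecuteReader())
        {
            while (rd.Read())
            {
                string test = rd.GetString(0);       // varchar column
                int value = rd.GetInt32(1);          // int column
                DateTime when = rd.GetDateTime(2);   // datetime column
            }
        }
    }
}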

Microsoft SQL Server also allows user-defined composite types (UDTs) to be defined and used. It also makes server statistics available as virtual tables and views (called Dynamic Management Views or DMVs). In addition to tables, a database can also contain other objects including views, stored procedures, indexes and constraints, along with a transaction log. A SQL Server database can contain a maximum of 2^31 objects, and can span multiple OS-level files with a maximum file size of 2^60 bytes. The data in the database are stored in primary data files with an extension .mdf. Secondary data files, identified with a .ndf extension, are used to store optional metadata. Log files are identified with the .ldf extension.

Storage space allocated to a database is divided into sequentially numbered pages, each 8 KB in size. A page is the basic unit of I/O for SQL Server operations. A page is marked with a 96-byte header which stores metadata about the page including the page number, page type, free space on the page and the ID of the object that owns it. Page type defines the data contained in the page: data stored in the database, index, allocation map which holds information about how pages are allocated to tables and indexes, change map which holds information about the changes made to other pages since last backup or logging, or large data types such as image or text. While the page is the basic unit of an I/O operation, space is actually managed in terms of an extent which consists of 8 pages. A database object can either span all 8 pages in an extent ("uniform extent") or share an extent with up to 7 more objects ("mixed extent"). A row in a database table cannot span more than one page, so is limited to 8 KB in size. However, if the data exceeds 8 KB and the row contains Varchar or Varbinary data, the data in those columns are moved to a new page (or possibly a sequence of pages, called an Allocation unit) and replaced with a pointer to the data.

For physical storage of a table, its rows are divided into a series of partitions (numbered 1 to n). The partition size is user defined; by default all rows are in a single partition. A table is split into multiple partitions in order to spread a database over a cluster. Rows in each partition are stored in either a B-tree or heap structure. If the table has an associated index to allow fast retrieval of rows, the rows are stored in order according to their index values, with a B-tree providing the index. The data is in the leaf nodes, with the other nodes storing the index values for the leaf data reachable from the respective nodes. If the index is non-clustered, the rows are not sorted according to the index keys. An indexed view has the same storage structure as an indexed table. A table without an index is stored in an unordered heap structure. Both heaps and B-trees can span multiple allocation units.

Buffer management

SQL Server buffers pages in RAM to minimize disc I/O. Any 8 KB page can be buffered in-memory, and the set of all pages currently buffered is called the buffer cache. The amount of memory available to SQL Server decides how many pages will be cached in memory. The buffer cache is managed by the Buffer Manager. Either reading from or writing to any page copies it to the buffer cache. Subsequent reads or writes are redirected to the in-memory copy, rather than the on-disc version. The page is updated on the disc by the Buffer Manager only if the in-memory cache has not been referenced for some time. While writing pages back to disc, asynchronous I/O is used whereby the I/O operation is done in a background thread so that other operations do not have to wait for the I/O operation to complete. Each page is written along with its checksum when it is written. When reading the page back, its checksum is computed again and matched with the stored version to ensure the page has not been damaged or tampered with in the meantime.

6. SYSTEM TESTING

Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs (errors or other defects).

Development Testing is a software development process that involves synchronized application of a broad spectrum of defect prevention and detection strategies in order to reduce software development risks, time, and costs. It is performed by the software developer or engineer during the construction phase of the software development lifecycle. Rather than replace traditional QA focuses, it augments it. Development Testing aims to eliminate construction errors before code is promoted to QA; this strategy is intended to increase the quality of the resulting software as well as the efficiency of the overall development and QA process. Software testing can be stated as the process of validating and verifying that a computer program/application/product:

  • meets the requirements that guided its design and development,
  • works as expected,
  • can be implemented with the same characteristics,
  • and satisfies the needs of stakeholders.

The testing methods applied in this project include:

  • White-box testing
  • Black-box testing
  • Unit testing
  • Integration testing
  • System testing
  • Acceptance testing
  • Validation testing

WHITE-BOX TESTING

White-box testing is a method of testing software that tests internal structures or workings of an application, as opposed to its functionality (i.e. black-box testing). In white-box testing an internal perspective of the system, as well as programming skills, are used to design test cases. The tester chooses inputs to exercise paths through the code and determine the appropriate outputs. This is analogous to testing nodes in a circuit, e.g. in-circuit testing (ICT).

While white-box testing can be applied at the unit, integration and system levels of the software testing process, it is usually done at the unit level. It can test paths within a unit, paths between units during integration, and between subsystems during a system–level test. Though this method of test design can uncover many errors or problems, it might not detect unimplemented parts of the specification or missing requirements.

BLACK-BOX TESTING

Black-box testing is designed to validate functional requirements without regard to the internal workings of a program. Black-box testing mainly focuses on the information domain of the software, deriving test cases by partitioning input and output in a manner that provides thorough test coverage. Incorrect and missing functions, interface errors, errors in data structures and errors in functional logic are the errors falling into this category.

UNIT TESTING

Unit testing is a method by which individual units of source code, sets of one or more computer program modules together with associated control data, usage procedures, and operating procedures, are tested to determine if they are fit for use. Intuitively, one can view a unit as the smallest testable part of an application. In procedural programming a unit could be an entire module but is more commonly an individual function or procedure. In object-oriented programming a unit is often an entire interface, such as a class, but could be an individual method. Unit tests are created by programmers or occasionally by white box testers during the development process.

Ideally, each test case is independent from the others: substitutes like method stubs, mock objects, fakes and test harnesses can be used to assist testing a module in isolation. Unit tests are typically written and run by software developers to ensure that code meets its design and behaves as intended. Its implementation can vary from being very manual (pencil and paper) to being formalized as part of build automation.
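As an illustration, a hedged unit-test sketch for a hypothetical helper that classifies a lab value against its reference range; both the helper and the use of NUnit are assumptions, not part of the project code:

using NUnit.Framework;

// Hypothetical helper under test: classify a value against a from/to reference range.
public static class ResultClassifier
{
    public static string Classify(int value, int fromRange, int toRange)
    {
        if (value < fromRange) return "Low";
        if (value > toRange) return "High";
        return "Normal";
    }
}

[TestFixture]
public class ResultClassifierTests
{
    [Test]
    public void ValueInsideRangeIsNormal()
    {
        Assert.AreEqual("Normal", ResultClassifier.Classify(90, 70, 110));
    }

    [Test]
    public void ValueAboveRangeIsHigh()
    {
        Assert.AreEqual("High", ResultClassifier.Classify(130, 70, 110));
    }
}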

INTEGRATION TESTING:

Usage Model testing can be used in both software and hardware integration testing. The basis behind this type of integration testing is to run user-like workloads in integrated user-like environments. In doing the testing in this manner, the environment is proofed, while the individual components are proofed indirectly through their use. Usage Model testing takes an optimistic approach to testing, because it expects to have few problems with the individual components. The strategy relies heavily on the component developers to do the isolated unit testing for their product. The goal of the strategy is to avoid redoing the testing done by the developers, and instead flesh out problems caused by the interaction of the components in the environment.

For integration testing, Usage Model testing can be more efficient and provides better test coverage than traditional focused functional integration testing. To be more efficient and accurate, care must be used in defining the user-like workloads for creating realistic scenarios in exercising the environment. This gives confidence that the integrated environment will work as expected for the target customers.

SYSTEM TESTING:

System testing takes, as its input, all of the “integrated” software components that have passed integration testing and also the software system itself integrated with any applicable hardware system(s). The purpose of integration testing is to detect any inconsistencies between the software units that are integrated together (called assemblages) or between any of the assemblages and the hardware. System testing is a more limited type of testing; it seeks to detect defects both within the “inter-assemblages” and also within the system as a whole.

ACCEPTANCE TESTING:

Acceptance test cards are ideally created during sprint planning or iteration planning meeting, before development begins so that the developers have a clear idea of what to develop. Sometimes acceptance tests may span multiple stories (that are not implemented in the same sprint) and there are different ways to test them out during actual sprints. One popular technique is to mock external interfaces or data to mimic other stories which might not be played out during iteration (as those stories may have been relatively lower business priority). A user story is not considered complete until the acceptance tests have passed.

VALIDATION TESTING:

Verification is intended to check that a product, service, or system (or portion thereof, or set thereof) meets a set of initial design specifications. In the development phase, verification procedures involve performing special tests to model or simulate a portion, or the entirety, of a product, service or system, then performing a review or analysis of the modeling results. In the post-development phase, verification procedures involve regularly repeating tests devised specifically to ensure that the product, service, or system continues to meet the initial design requirements, specifications, and regulations as time progresses. It is a process that is used to evaluate whether a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase. Verification can be in development, scale-up, or production. This is often an internal process.

6.2 Test case

A test case normally consists of a unique identifier, requirement references from a design specification, preconditions, events, a series of steps (also known as actions) to follow, input, output, expected result, and actual result. Clinically defined, a test case is an input and an expected result. This can be as pragmatic as 'for condition x your derived result is y', whereas other test cases describe in more detail the input scenario and what results might be expected. It can occasionally be a series of steps (but often steps are contained in a separate test procedure that can be exercised against multiple test cases, as a matter of economy) but with one expected result or expected outcome.

The optional fields are a test case ID, test step, or order of execution number, related requirement(s), depth, test category, author, and check boxes for whether the test is automatable and has been automated. Larger test cases may also contain prerequisite states or steps, and descriptions. A test case should also contain a place for the actual result. These steps can be stored in a word processor document, spreadsheet, database, or other common repository. In a database system, you may also be able to see past test results, who generated the results, and what system configuration was used to generate those results. These past results would usually be stored in a separate table.

7. IMPLEMENTATION

System implementation generally benefits from high levels of user involvement and management support. User participation in the design and operation of information systems has several positive results. First, if users are heavily involved in systems design, they have more opportunities to mold the system according to their priorities and business requirements, and more opportunities to control the outcome. Second, they are more likely to react positively to the change process. Incorporating user knowledge and expertise leads to better solutions.

The relationship between users and information systems specialists has traditionally been a problem area for information systems implementation efforts. Users and information systems specialists tend to have different backgrounds, interests, and priorities. This is referred to as the user-designer communications gap. These differences lead to divergent organizational loyalties, approaches to problem solving, and vocabularies

Conversions to new systems often get off track because companies fail to plan the project realistically or they don’t execute or manage the project by the plan. Remember that major systems conversions are not just IT projects. Companies should maintain joint responsibility with the vendor in the project-planning process, maintenance of the project-plan status, as well as some degree of control over the implementation.

All key user departments should have representation on the project team, including the call center, website, fulfillment, management, merchandising, inventory control, marketing and finance. Team members should share responsibilities for conversion, training and successful completion of the project tasks.

The software vendor should have a time-tested project methodology and provide a high-level general plan. As the merchant client, your job is to develop the detailed plan with the vendor, backed up with detail tasks and estimates.

For example, a generalized plan may have a list of system modifications, but lack the details that need to be itemized. These may include research, specifications, sign-offs, program specs, programming, testing and sign-off, and the various levels of testing and program integration back into the base system.

Plan for contingencies, and try to keep disruptions to the business to a minimum. We have seen systems go live with management initially unable to get their most frequently used reports; this can be a big problem.

The systems project should have a senior manager who acts as the project sponsor. The project should be reviewed periodically by the steering committee to track its progress. This ensures that senior management on down to the department managers are committed to success.

Once you have a plan that makes sense, make sure you manage by the plan. This sounds elementary, but many companies and vendors stumble on it.

Early in the project publish a biweekly status report. Once you get within a few months, you may want to have weekly conference call meetings and status updates. Within 30 days of “go live,” hold daily meetings and list what needs to be achieved.

Project management is the discipline of planning, organizing, motivating, and controlling resources to achieve specific goals. A project is a temporary endeavor with a defined beginning and end (usually time-constrained, and often constrained by funding or deliverables), undertaken to meet unique goals and objectives, typically to bring about beneficial change or added value. The temporary nature of projects stands in contrast with business as usual (or operations), which are repetitive, permanent, or semi-permanent functional activities to produce products or services. In practice, the management of these two systems is often quite different, and as such requires the development of distinct technical skills and management strategies.

The primary challenge of project management is to achieve all of the project goals and objectives while honoring the preconceived constraints. The primary constraints are scope, time, quality and budget. The secondary, and more ambitious, challenge is to optimize the allocation of necessary inputs and integrate them to meet pre-defined objectives.

Documentation:

Requirements come in a variety of styles, notations and formality. Requirements can be goal-like (e.g., distributed work environment), close to design (e.g., builds can be started by right-clicking a configuration file and select the ‘build’ function), and anything in between. They can be specified as statements in natural language, as drawn figures, as detailed mathematical formulas, and as a combination of them all.

A good architecture document is short on details but thick on explanation. It may suggest approaches for lower level design, but leave the actual exploration trade studies to other documents.

It is very important for user documents to not be confusing, and for them to be up to date. User documents need not be organized in any particular way, but it is very important for them to have a thorough index.

Change Management within ITSM (as opposed to software engineering or project management) is often associated with ITIL, but the origins of change as an IT management process predate ITIL considerably, at least according to the IBM publication A Management System for the Information Business.

In the ITIL framework, Change Management is a part of “Service Transition” – transitioning something newly developed (i.e. an update to an existing production environment or deploying something entirely new) from the Service Design phase into Service Operation (AKA Business As Usual) and aims to ensure that standardized methods and procedures are used for efficient handling of all changes.

System Installation:

A headless installation is performed without a monitor connected. In attended forms of headless installation, another machine connects to the target machine (for instance, via a local area network) and takes over the display output. Since a headless installation does not need a user at the location of the target computer, unattended headless installers may be used to install a program on multiple machines at the same time.

An automatic installation is an installation process that runs at a preset time or when a predefined condition transpires, as opposed to an installation process that starts explicitly on a user's command. For instance, a system that is set to install a later version of a program currently in use can schedule that installation to occur when the program is not running. An operating system may automatically install a device driver for a device that the user connects (see plug and play). Malware may also be installed automatically; for example, the infamous Conficker was installed when the user plugged an infected device into their computer.

7.1 USER TRAINING

Training forms the core of apprenticeships and provides the backbone of content at institutes of technology (also known as technical colleges or polytechnics). In addition to the basic training required for a trade, occupation or profession, observers of the labor market recognize, as of 2008, the need to continue training beyond initial qualifications: to maintain, upgrade and update skills throughout working life. People within many professions and occupations may refer to this sort of training as professional development.

  • Training Tips
    • Train people in groups, with separate training programs for distinct groups
    • Select the most effective place to conduct the training
    • Provide for learning by hearing, seeing, and doing
    • Prepare effective training materials, including interactive tutorials
    • Rely on previous trainees

Convert a subset of the database. Apply incremental updates on the new system or the old system to synchronize the updates taking place. Use this method when the system is deployed in releases. For example, updates are applied to the old system if a pilot project is applying updates against the new system, so that areas outside the pilot have access to the data.

Some commentators use a similar term for workplace learning to improve performance: “training and development“. There are also additional services available online for those who wish to receive training above and beyond that which is offered by their employers. Some examples of these services include career counselling, skill assessment, and supportive services. One can generally categorize such training as on-the-job or off-the-job:

On-the-job training method takes place in a normal working situation, using the actual tools, equipment, documents or materials that trainees will use when fully trained. On-the-job training has a general reputation as most effective for vocational work. It involves Employee training at the place of work while he or she is doing the actual job. Usually a professional trainer (or sometimes an experienced employee) serves as the course instructor using hands-on training often supported by formal classroom training.

Off-the-job training method takes place away from normal work situations — implying that the employee does not count as a directly productive worker while such training takes place. Off-the-job training method also involves employee training at a site away from the actual work environment. It often utilizes lectures, case studies, role playing and simulation, having the advantage of allowing people to get away from work and concentrate more thoroughly on the training itself. This type of training has proven more effective in inculcating concepts and ideas.

  • A more recent development in job training is the On the Job Training Plan, or OJT Plan. According to the United States Department of the Interior, a proper OJT plan should include: An overview of the subjects to be covered, the number of hours the training is expected to take, an estimated completion date, and a method by which the training will be evaluated.

8. CONCLUSION

This project has been implemented successfully and the output has been verified. All outputs are generated according to the given input, and data validations are performed on both user and admin input. The patient's username and password are generated in the admin login, and all generated usernames and passwords log in successfully. While creating the test details, all data responds properly according to the input details.

Secured logins are provided for both admin and patient, and an individual master page has been created for each. Data integrity has been verified while uploading the test details, and the SQL database has been successfully configured for storing the data. Patients are provided with charts to review their test history. Thus the project has been implemented successfully according to the commitment.

9. FUTURE ENHANCEMENT

Even though the system has been enhanced and planned well, there are still some future enhancements to be made. Due to time constraints, the project has been stopped at this phase.

This application is developed using client-server technology, but it is not yet hosted on a web server. Hosting the application on a web server would make it available in real time.

The application can also be hosted on a cloud server, which would improve performance.

A dedicated mobile application can be implemented for Android, iOS and Windows.

The application can be centralized and tied to hospitals as well, enabling the lab assistant to send the test report directly to the concerned doctor.

Patients could take tests under their insurance schemes.

More comparison charts can be created.

Data backups can be taken from the admin side; the admin can take a backup every month.

10. BIBLIOGRAPHY

Books referred

  1. Alistair McMonnies, "Object-Oriented Programming in ASP.NET", Pearson Education, ISBN 81-297-0649-0, First Indian Reprint 2004.
  2. Jittery R. Shapiro, "The Complete Reference ASP.NET", Edition 2002, Tata McGraw-Hill Publishing Company Limited, New Delhi.
  3. Robert D. Schneider and Jettey R. Garbus, "Optimizing SQL Server", Second Edition, Pearson Education Asia, ISBN 981-4035-20-3.

Websites referred:

  1. http://www.microsoft.com/dotnet/visual basic
  2. http://www.dotnetheaven.com
  3. http://www.codeproject.com
  4. http://www.Planetcode.com
  5. http://www.stackoverflow.com

11. SAMPLE SCREENS


12. SAMPLE CODE

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.Configuration;
using System.Data.SqlClient;

public partial class adcreatecustomer : System.Web.UI.Page
{
    SqlConnection con;
    SqlCommand cmd;
    string query, gen;

    // Open a connection using the "connection" string from web.config.
    public void data()
    {
        string connstring = WebConfigurationManager.ConnectionStrings["connection"].ConnectionString;
        con = new SqlConnection(connstring);
        con.Open();
    }

    // Return a random number between min and max.
    private int random(int min, int max)
    {
        Random random = new Random();
        return random.Next(min, max);
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Read the selected gender on every request.
        if (rdmale.Checked == true)
        {
            gen = "Male";
        }
        else
        {
            gen = "Female";
        }
    }

    protected void txtname_TextChanged(object sender, EventArgs e)
    {
        lblmes.Visible = false;
        txtage.Focus();
    }

    // Generate a username (first three letters of the name plus a random number) and a random password.
    protected void btngenerate_Click(object sender, EventArgs e)
    {
        lblusername.Visible = true;
        lblpassword.Visible = true;
        int a = random(100, 999);
        int b = random(10000, 99999);
        lblusername.Text = txtname.Text.Substring(0, 3) + a.ToString();
        lblpassword.Text = b.ToString();
    }

    // Create the customer record if the generated username does not already exist.
    protected void btncreate_Click(object sender, EventArgs e)
    {
        data();
        query = "select username from customertbl where username='" + lblusername.Text + "'";
        cmd = new SqlCommand(query, con);
        SqlDataReader rd = cmd.ExecuteReader();
        if (rd.Read())
        {
            lblmes2.Visible = true;
            lblmes2.Text = "Already Exists";
        }
        else
        {
            lblmes2.Visible = false;
            data();
            query = "insert into customertbl(name,gender,age,bloodgroup,street1,street2,city,state,pin,phone,email,username,password)values('" + txtname.Text + "','" + gen + "','" + txtage.Text + "','" + ddbloodgroup.SelectedItem + "','" + txtstreet1.Text + "','" + txtstreet2.Text + "','" + txtcity.Text + "','" + txtstate.Text + "','" + txtpinno.Text + "','" + txtphoneno.Text + "','" + txtemail.Text + "','" + lblusername.Text + "','" + lblpassword.Text + "')";
            cmd = new SqlCommand(query, con);
            cmd.ExecuteNonQuery();
            con.Close();
            lblmes.Visible = true;
            lblmes.Text = "Customer Created";
            rdmale.Checked = true;
            rdfemale.Checked = false;
            txtname.Text = "";
            txtage.Text = "";
            txtstreet1.Text = "";
            txtstreet2.Text = "";
            txtcity.Text = "";
            txtstate.Text = "";
            txtpinno.Text = "";
            txtphoneno.Text = "";
            txtemail.Text = "";
            txtname.Focus();
        }
        rd.Close();
        con.Close();
        lblusername.Visible = false;
        lblpassword.Visible = false;
    }

    // Clear the form fields.
    protected void btncancel_Click(object sender, EventArgs e)
    {
        txtname.Text = "";
        txtage.Text = "";
        txtstreet1.Text = "";
        txtstreet2.Text = "";
        txtcity.Text = "";
        txtstate.Text = "";
        txtpinno.Text = "";
        txtphoneno.Text = "";
        txtemail.Text = "";
        txtname.Focus();
    }
}

using System;

using System.Collections.Generic;

using System.Linq;

using System.Web;

using System.Web.UI;

using System.Web.UI.WebControls;

using System.Web.Configuration;

using System.Data.SqlClient;

public partial class admanagetest : System.Web.UI.Page

{

SqlConnection con;

SqlCommand cmd;

string query;

public void data()

{

string connstring = WebConfigurationManager.ConnectionStrings[“connection”].ConnectionString;

con = new SqlConnection(connstring);

con.Open();

}

protected void Page_Load(object sender, EventArgs e)

{

}

protected void DropDownList1_SelectedIndexChanged(object sender, EventArgs e)

{

GridView1.DataBind();

}

protected void GridView1_RowUpdating(object sender, GridViewUpdateEventArgs e)
{
    // Pull the edited values out of the row's template controls.
    TextBox subtest = (TextBox)GridView1.Rows[e.RowIndex].FindControl("txtsubtestname");
    TextBox frmrange = (TextBox)GridView1.Rows[e.RowIndex].FindControl("txtfrmrange");
    TextBox frmmeasurement = (TextBox)GridView1.Rows[e.RowIndex].FindControl("txtfrmmeasurement");
    TextBox torange = (TextBox)GridView1.Rows[e.RowIndex].FindControl("txttorange");
    TextBox tomeasurement = (TextBox)GridView1.Rows[e.RowIndex].FindControl("txttomeasurement");

    // The data key holds the original sub test name, which is used as the update key.
    string subtestname = GridView1.DataKeys[e.RowIndex].Values[0].ToString();

    data();
    query = "update newtesttbl set subtestname='" + subtest.Text + "', frmrange='" + frmrange.Text
        + "',fmeasurement='" + frmmeasurement.Text + "',torange='" + torange.Text
        + "',tmeasurement='" + tomeasurement.Text + "' where subtestname ='" + subtestname + "'";
    SqlDataSource2.UpdateCommand = query;
    SqlDataSource2.Update();
    con.Close();
    GridView1.DataBind();
}

protected void GridView1_RowDeleting(object sender, GridViewDeleteEventArgs e)
{
    string subtestname = GridView1.DataKeys[e.RowIndex].Values[0].ToString();

    data();
    query = "delete from newtesttbl where subtestname ='" + subtestname + "'";
    SqlDataSource2.DeleteCommand = query;
    SqlDataSource2.Delete();
    con.Close();   // the original left this connection open
    GridView1.DataBind();
}

}
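The update and delete above also splice the grid values straight into the SQL text and route it through SqlDataSource2. Below is a minimal alternative sketch that issues the same update and delete with parameterized SqlCommand calls on a fresh connection instead of going through the SqlDataSource control; it assumes the newtesttbl columns named above, and the class and method names (TestData, UpdateSubTest, DeleteSubTest) are illustrative only.

using System.Data.SqlClient;
using System.Web.Configuration;

public static class TestData
{
    private static SqlConnection Open()
    {
        string connstring = WebConfigurationManager.ConnectionStrings["connection"].ConnectionString;
        SqlConnection con = new SqlConnection(connstring);
        con.Open();
        return con;
    }

    // Update one sub test row, keyed by its original sub test name.
    public static void UpdateSubTest(string originalName, string newName, string frmRange,
        string frmMeasurement, string toRange, string toMeasurement)
    {
        using (SqlConnection con = Open())
        using (SqlCommand cmd = new SqlCommand(
            "update newtesttbl set subtestname=@newname, frmrange=@frmrange, fmeasurement=@fmeasurement, " +
            "torange=@torange, tmeasurement=@tmeasurement where subtestname=@oldname", con))
        {
            cmd.Parameters.AddWithValue("@newname", newName);
            cmd.Parameters.AddWithValue("@frmrange", frmRange);
            cmd.Parameters.AddWithValue("@fmeasurement", frmMeasurement);
            cmd.Parameters.AddWithValue("@torange", toRange);
            cmd.Parameters.AddWithValue("@tmeasurement", toMeasurement);
            cmd.Parameters.AddWithValue("@oldname", originalName);
            cmd.ExecuteNonQuery();
        }
    }

    // Delete one sub test row by sub test name.
    public static void DeleteSubTest(string subTestName)
    {
        using (SqlConnection con = Open())
        using (SqlCommand cmd = new SqlCommand(
            "delete from newtesttbl where subtestname=@name", con))
        {
            cmd.Parameters.AddWithValue("@name", subTestName);
            cmd.ExecuteNonQuery();
        }
    }
}

Inside GridView1_RowUpdating the call would then be TestData.UpdateSubTest(subtestname, subtest.Text, frmrange.Text, frmmeasurement.Text, torange.Text, tomeasurement.Text) followed by GridView1.DataBind(), and GridView1_RowDeleting would call TestData.DeleteSubTest(subtestname) the same way.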


