Design and train a neural network to accomplish some classification task.
For this milestone, you will design and train a neural network to accomplish some classification task.
Choose a data set
The UCI Machine Learning Archive hosts various data sets suitable for testing learning algorithms. I suggest clicking on View ALL Data Sets on the right side of the page. That provides a nice interface in which you can filter by data type or area of interest.
The data should be suitable for a classification task, not clustering, recommendations, or regression. Neural networks support both categorical and numerical data; you'll just want to keep the number of attributes to less than 100, because we'll have to tune the way each attribute is presented to the network.
When you click on the data set, you'll see a description, citations, and details about the attributes. There are links near the top to the Data Folder, and there you'll find a list of files ending in .data (the raw data) or .names (attribute descriptions).
Download the data and descriptions. I have a lot of experience with the Mushroom data, so I'll explore that in this explanation, but you can choose something else for your project. For mushrooms, the .names file contains:
1. Title: Mushroom Database
2. Sources:
(a) Mushroom records drawn from The Audubon Society Field Guide to North
American Mushrooms (1981). G. H. Lincoff (Pres.), New York: Alfred
A. Knopf
(b) Donor: Jeff Schlimmer (Jeffrey.Schlimmer@a.gp.cs.cmu.edu)
(c) Date: 27 April 1987
3. Past Usage:
1. Schlimmer,J.S. (1987). Concept Acquisition Through Representational
Adjustment (Technical Report 87-19). Doctoral disseration, Department
of Information and Computer Science, University of California, Irvine.
--- STAGGER: asymptoted to 95% classification accuracy after reviewing
1000 instances.
[etc.]
5. Number of Instances: 8124
6. Number of Attributes: 22 (all nominally valued)
7. Attribute Information: (classes: edible=e, poisonous=p)
1. cap-shape: bell=b,conical=c,convex=x,flat=f,
knobbed=k,sunken=s
2. cap-surface: fibrous=f,grooves=g,scaly=y,smooth=s
3. cap-color: brown=n,buff=b,cinnamon=c,gray=g,green=r,
pink=p,purple=u,red=e,white=w,yellow=y
[etc.]
The .data file is a text file with comma-separated values (CSV), which can be imported easily into Excel or other spreadsheet applications:
p,x,s,n,t,p,f,c,n,k,e,e,s,s,w,w,p,w,o,p,k,s,u
e,x,s,y,t,a,f,c,b,k,e,c,s,s,w,w,p,w,o,p,n,n,g
e,b,s,w,t,l,f,c,b,n,e,c,s,s,w,w,p,w,o,p,n,n,m
p,x,y,w,t,p,f,c,n,n,e,e,s,s,w,w,p,w,o,p,k,s,u
e,x,s,g,f,n,f,w,b,k,t,e,s,s,w,w,p,w,o,e,n,a,g
[etc.]
Design your network
Your next task is to design your neural network architecture: how many neurons in each layer, and how to map neuronal activations to and from the data set?
Input layer
The number of input neurons will be based on the number of attributes in your data set, but it may not be a one-to-one match.
Generally, a continuous (real number) attribute can map directly to one neuron. There are no continuous attributes in the mushroom set, but the Heart Disease data contains a few, such as:
thalach: maximum heart rate achieved
which has values like 127, 154, or 166. It is helpful, however, to normalize these values to the range 0..1, so they are not terribly out of proportion to the inputs from other attributes. In the case of heart rate, we would find the minimum (60) and the maximum (182) in the data file. Then, to convert any value, we subtract the minimum and divide by the size of the range (182 - 60 = 122):
Raw value Normalized value
60 0.0 = (60-60)/122
127 0.549180327869 = (127-60)/122
154 0.770491803279 = (154-60)/122
166 0.868852459016 = (166-60)/122
182 1.0 = (182-60)/122
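In code, that conversion is a single line of arithmetic. Here is a minimal sketch, assuming you have already found the minimum and maximum for the attribute (the helper name normalize is mine, for illustration):
// Map a raw attribute value into the range 0..1, given the
// minimum and maximum observed in the data file.
double normalize(double raw, double minimum, double maximum)
{
    return (raw - minimum) / (maximum - minimum);
}
// Example: thalach has minimum 60 and maximum 182 in the data file,
// so normalize(127, 60, 182) yields 0.549180..., matching the table above.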
A discrete (categorical) attribute must be translated in some way, usually using a binary encoding. Let's take the cap-shape of mushrooms as an example. These are the possible values:
bell=b, conical=c, convex=x, flat=f, knobbed=k, sunken=s
Because there are 6 possible values, we can represent them in ⌈log2(6)⌉ = 3 input neurons, like this:
Code Category # Binary Input[0] Input[1] Input[2]
b bell 0 000 0.0 0.0 0.0
c conical 1 001 0.0 0.0 1.0
x convex 2 010 0.0 1.0 0.0
f flat 3 011 0.0 1.0 1.0
k knobbed 4 100 1.0 0.0 0.0
s sunken 5 101 1.0 0.0 1.0
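If you would rather not write out each bit pattern by hand, the same encoding can be generated from the category's index. This is only a sketch, assuming the inputs vector and running index i used in the implementation code later on this page (the helper name encode_category is mine); the hand-coded switch statements shown below work just as well:
#include <vector>
using namespace std;

// Write the binary encoding of a category index (0, 1, 2, ...) into
// num_bits consecutive input neurons, most significant bit first.
void encode_category(vector<double>& inputs, int& i, int index, int num_bits)
{
    for(int bit = num_bits - 1; bit >= 0; bit--)
        inputs[i++] = (index >> bit) & 1;
}
// Example: cap-shape 'x' (convex) is category 2, so
// encode_category(inputs, i, 2, 3) sets the next three inputs to 0, 1, 0.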
Work through the attribute descriptions for your data set to determine the number of input neurons, the normalization parameters for continuous attributes, and the binary encoding for discrete attributes.
Hidden layer
You will have to decide how many neurons to use in the hidden layer. Too few, and the network will not be sophisticated enough to recognize the patterns in the data. Too many, and the network may take longer to converge on an acceptable solution.
I recommend starting with the same number of hidden neurons as input neurons, and then experimenting with reducing it.
Output layer
Most classifications will be discrete categories: poisonous/edible for mushrooms, or the diagnosis of heart disease in that data set:
num: diagnosis of heart disease (angiographic disease status)
-- Value 0: < 50% diameter narrowing
-- Value 1: > 50% diameter narrowing
You will want to have one output neuron for each possible classification, and use the winner-take-all strategy: the neuron with the highest activation determines the result. Here are the expected outputs for the categories of mushrooms:
Code Category Output[0] Output[1]
e edible 1.0 0.0
p poison 0.0 1.0
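When you later evaluate the trained network, this table is read in the other direction: whichever output neuron has the highest activation decides the classification. A minimal sketch of that winner-take-all interpretation (the function name classify is mine, for illustration):
#include <vector>
using namespace std;

// Winner-take-all: return the index of the output neuron with the
// highest activation (0 = edible, 1 = poisonous for the mushroom data).
int classify(const vector<double>& outputs)
{
    int winner = 0;
    for(int k = 1; k < (int)outputs.size(); k++)
        if(outputs[k] > outputs[winner])
            winner = k;
    return winner;
}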
Implementation
Network architecture
Start with your (or my) mazur-nn implementation, but you'll have to redefine the network architecture, something like this:
// 3-layer network architecture
const int NUM_INPUTS = 57;
const int NUM_HIDDEN = 10;
const int NUM_OUTPUTS = 2;
Those are the numbers I used for the mushroom data, but you can alter them for your own network.
Input data
Next, you'll need to read the data. Here's a routine you can use that interprets the comma-separated values (CSV) format generally used by data sets in the UCI archive:
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <iostream>
#include <string>
#include <vector>
using namespace std;

void read_csv(const char* filename,
              vector< vector<string> >& data)
{
    const int BUFFER_SIZE = 8192;
    char buffer[BUFFER_SIZE];
    FILE* fp = fopen(filename, "r");
    if(!fp) { perror(filename); exit(1); }

    // Read each line of the file
    while(fgets(buffer, BUFFER_SIZE-1, fp)) {
        // Parse by splitting on commas (and strip the trailing newline)
        vector<string> row;
        char* elt = strtok(buffer, ",\n");
        while(elt) {
            row.push_back(elt);
            elt = strtok(NULL, ",\n");
        }
        data.push_back(row);
    }
    cout << "Read " << data.size() << " records x "
         << data[0].size() << " attributes from " << filename << "\n";
    fclose(fp);
}
You'd call it like this:
vector<vector<string>> data;
// Read data from file into two-dimensional vector
read_csv("shrooms.data", data);
If it works, you should see a message like this upon running the program:
Read 8124 records x 23 attributes from shrooms.data
Set target outputs
Next, you'll have to modify the parts of the mazur-nn code that provide inputs to the network and that specify the target outputs. Let's begin with the target outputs. In the mazur-nn example, we simply used:
vector<double> targets = {0.01, 0.99};
But now we'll have to vary that for each example in the data file. So declare it this way:
vector<double> targets (NUM_OUTPUTS);
We use this to implement the winner-take-all strategy. In the mushroom data file, the edible/poisonous classification is the first (0th) attribute, and it's a single character, e or p. When we access row[0], we grab the string value of the 0th attribute, and the extra [0] grabs the first (0th) character of the string.
switch(row[0][0]) {
    case 'e':
        targets[0] = 1;
        targets[1] = 0;
        break;
    case 'p':
        targets[0] = 0;
        targets[1] = 1;
        break;
    default:
        cout << "Error: unexpected classification " << row[0][0] << "\n";
        abort();
}
The default case helps detect potential errors in parsing the file.
Your output interpretation will be similar, but it must be based on the format of your data and the number of output neurons.
Set network inputs: discrete
As for the network inputs, here's the code we used in the mazur-nn example:
// Provide inputs to the network
inputs.at(0) = .05;
inputs.at(1) = .10;
But now we have to interpret the data vector, normalizing continuous attributes or binary-encoding discrete attributes as described in the Input layer section above. Here's an example of specifying just the first attribute in the mushroom database:
int i = 0;

// 1. cap-shape: bell=b,conical=c,convex=x,flat=f,
//               knobbed=k,sunken=s
switch(row[1][0]) {
    case 'b': inputs[i++] = 0; inputs[i++] = 0; inputs[i++] = 0; break;
    case 'c': inputs[i++] = 0; inputs[i++] = 0; inputs[i++] = 1; break;
    case 'x': inputs[i++] = 0; inputs[i++] = 1; inputs[i++] = 0; break;
    case 'f': inputs[i++] = 0; inputs[i++] = 1; inputs[i++] = 1; break;
    case 'k': inputs[i++] = 1; inputs[i++] = 0; inputs[i++] = 0; break;
    case 's': inputs[i++] = 1; inputs[i++] = 0; inputs[i++] = 1; break;
    default:
        cout << "Error: unhandled cap-shape " << row[1][0] << "\n";
        abort();
}
This fragment illustrates converting a discrete value, represented by single characters, into bits in a binary encoding. You can see that the zeroes and ones assigned to inputs[i++] match the binary encodings in the previous table.
If the discrete values in your data file are not represented by a single character, but rather by a string, then you cannot use switch/case statements as above. Instead, you would use a series of if/else statements comparing the complete strings, like this:
// Here we're illustrating three strings as the possibilities
// for column K, so we encode them as bits into two inputs.
if(row[K] == "large") {
    inputs[i++] = 0; inputs[i++] = 0;
}
else if(row[K] == "medium") {
    inputs[i++] = 0; inputs[i++] = 1;
}
else if(row[K] == "small") {
    inputs[i++] = 1; inputs[i++] = 0;
}
else {
    cout << "Error: unhandled: " << row[K] << "\n";
    abort();
}
Set network inputs: continuous
If your data set has real-valued inputs, this is where you would normalize them to the range 0.0..1.0. Here's what that looks like for an attribute in column K:
double numeric_value = atof(row[K].c_str()); // Convert column K of current row
double normalized_value = (numeric_value - MINIMUM) / RANGE;
// (where you plug in MINIMUM and RANGE for column K)
inputs[i++] = normalized_value;
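MINIMUM and RANGE have to come from the data itself. One way, sketched here, is to scan column K once after read_csv has filled the data vector (the variable names are mine, for illustration):
// Scan column K once to find the minimum and maximum values,
// then use them to normalize every value in that column.
double minimum = atof(data[0][K].c_str());
double maximum = minimum;
for(size_t r = 1; r < data.size(); r++) {
    double v = atof(data[r][K].c_str());
    if(v < minimum) minimum = v;
    if(v > maximum) maximum = v;
}
double range = maximum - minimum;   // plug these into the normalization above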
Experiments
Once you have these input and output functions completed, you can experiment with training the network. Determine how many cycles it takes to converge to a solution with different proportions of training vs. test data. Then experiment with different numbers of hidden neurons.
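One simple way to get those proportions is to shuffle the records after reading them and hold out a fraction as the test set. A sketch, assuming the data vector from read_csv above (the 80/20 split and the variable names are mine, not part of mazur-nn):
#include <algorithm>
#include <random>

// Shuffle the records so the split is not biased by file order,
// then keep the first 80% for training and the rest for testing.
mt19937 rng(12345);                       // fixed seed so runs are repeatable
shuffle(data.begin(), data.end(), rng);
size_t split = data.size() * 8 / 10;
vector< vector<string> > training(data.begin(), data.begin() + split);
vector< vector<string> > testing(data.begin() + split, data.end());
Train only on the training portion, then count how many of the testing records the network classifies correctly.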