# NeuroShell Classifier

**NeuroShell Classifier: $595.00**

Use advanced AI techniques to solve classification and categorization problems.

**DOWNLOADABLE PRODUCT: Note that this product is available only by download. Upon placing an order, you will receive download instructions within 1 business day.**

**Wide range of business uses including** target market selection; sales prospect selection; oil exploration sites; employee selection; help desk applications; managerial decision making; audit target selection; security risk profiling; credit application processing; legal strategies; factory and shop problem analysis; personnel profiling; locating tax evaders; college application screening; employee retention; damaged product identification; alarm system malfunction diagnosis; quality control.

The NeuroShell Classifier solves classification and categorization problems based on patterns learned from historical data. The Classifier produces outputs which are the probabilities of the input pattern belonging to each of several categories. Examples of categories include {acidic, neutral, alkaline}, {buy, sell, hold}, and {cancer, benign}.

The classification algorithms (one is a new neural network and the other is a statistical classifier driven by a genetic algorithm) are the crowning achievement of several years of research. These algorithms have been optimized to solve classification problems. Gone are the days of dozens of parameters that must be artistically set to create a good model without overfitting. Gone are the days of hiring a neural net expert or a statistician to build your classification models.

*The NeuroShell Classifier allows you to build powerful classification models quickly.*

Statistical tools such as an agreement matrix (sensitivity and specificity), probability graphs, ROC curves, and input rankings assist in analyzing the effectiveness of your model.

Two of the most commonly heard complaints about previous classification systems, aside from being too hard to use, are that they are too slow and that they do not accurately reveal how important each variable is to the model. We've addressed both problems, which is why we offer two training methods from which to choose.

The first method, called TurboProp2, dynamically grows hidden neurons and trains very fast. TurboProp2 models are built (trained) in 10 to 30 seconds on a 200 MHz Pentium (a matter of seconds on newer computers), compared to hours for older neural network types.

The genetic training method takes a little longer to train but reveals the relative importance of each of your inputs. You will know which data you don't have to collect anymore! The genetic training method also trains everything in an out-of-sample mode: it essentially applies a "leave-one-out" technique, also called the "jackknife" or cross-validation, so the training set itself is scored out-of-sample. This method is therefore extremely effective when you do not have many patterns on which to train.
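A minimal sketch of the leave-one-out ("jackknife") evaluation idea described above: each pattern is classified using a model built from all the *other* patterns, so every prediction is out-of-sample. The 1-nearest-neighbor classifier here is a stand-in for illustration only, not the product's actual algorithm.

```python
# Leave-one-out evaluation sketch. The nearest-neighbor rule is an
# illustrative stand-in classifier, not NeuroShell's genetic method.

def nearest_neighbor_predict(train, query):
    """Return the label of the training pattern closest to `query`."""
    best = min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    return best[1]

def leave_one_out_accuracy(patterns):
    """Classify each pattern using every *other* pattern as training data."""
    correct = 0
    for i, (features, label) in enumerate(patterns):
        rest = patterns[:i] + patterns[i + 1:]
        if nearest_neighbor_predict(rest, features) == label:
            correct += 1
    return correct / len(patterns)

data = [((1.0, 1.0), "buy"), ((1.1, 0.9), "buy"),
        ((3.0, 3.2), "sell"), ((2.9, 3.1), "sell")]
print(leave_one_out_accuracy(data))  # every pattern scored out-of-sample
```

Because each pattern never participates in classifying itself, the resulting accuracy estimate is less optimistic than an in-sample score, which is why the method works well on small data sets.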

The NeuroShell Classifier facilitates integration with other programs because it uses standard text files. These files are easily imported into and exported from spreadsheet programs such as Excel and Lotus®. The NeuroShell Classifier is so easy to use that it doesn't need a manual! Instead, there is an "Instructor" that guides you through making the classification models. At every stage of the Instructor, our extensive help file will give you all the information you need. When you have learned from the Instructor, you can turn it off and work from the toolbar or menus. (The program does include an on-line manual that you may print yourself or just browse from your computer.)

*The NeuroShell Classifier displays an ROC curve to help you analyze the effectiveness of your model.*

Finally, for those who want to embed the resulting neural models into their own programs, or to distribute the results, there is an optional Run-Time Server available. Classifier models may be distributed in your programs without incurring royalties or other fees.

*The NeuroShell Classifier shows you the estimated relative importance of each variable in the model.*

## Specifications

### Software Requirements

The NeuroShell Classifier is a 32-bit program that requires Microsoft® Windows® 95, 98, 2000, XP, or Windows NT® (SP3 or higher). It will not run under Windows 3.1.

### Hardware Requirements

IBM® PC or compatible computer with a 486 or higher processor and 16 megabytes of RAM.

### Limits

- 150 input variables and one output variable (with multiple categories).
- 16,000 rows of data (example patterns).

*Note: These limits are not as inhibiting as they may seem for owners of large databases. Call for an explanation.*

### Files

ASCII text files separated by commas, spaces, tabs, or semicolons. If your data is in a spreadsheet, simply save it as a .CSV file.
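To illustrate the file layout, here is a hypothetical comma-separated data set in the ASCII format described above (the variable names and values are invented), read with Python's standard `csv` module:

```python
# Parse a small invented sample of the comma-separated ASCII layout:
# a header row of variable names, then one example pattern per row.
import csv
import io

sample = """ph,conductivity,category
6.2,0.41,acidic
7.0,0.10,neutral
8.5,0.77,alkaline
"""

rows = list(csv.DictReader(io.StringIO(sample)))
print(len(rows), rows[0]["category"])  # 3 example patterns; first is "acidic"
```

A spreadsheet saved as .CSV produces exactly this kind of file, which is why no special conversion step is needed.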

### Speed

Neural nets train very fast, usually in under a minute.

The genetic method will train very slowly on large files. This method may be more suitable for fewer than 3,000 rows of data.

### Statistics and Graphics

- Number of correct and incorrect classifications.
- Actual vs Predicted.
- Receiver Operating Characteristic (ROC) curves.
- Agreement matrix (sometimes called confusion matrix or contingency table).
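As background on the ROC curves listed above, a sketch of how such a curve is built: sweep a decision threshold over the model's "positive" probabilities and record the (false-positive ratio, true-positive ratio) point at each step. The scores and labels below are invented for illustration.

```python
# Build ROC curve points from predicted scores and confirmed labels (1 = positive).

def roc_points(scores, labels):
    """Return (false-pos ratio, true-pos ratio) pairs for each threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):  # highest threshold first
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

scores = [0.9, 0.8, 0.6, 0.4, 0.2]  # model's "positive" probabilities
labels = [1, 1, 0, 1, 0]            # confirmed classifications
print(roc_points(scores, labels))
```

A model whose curve rises quickly toward the top-left corner (high true-positive ratio at a low false-positive ratio) classifies more effectively than one whose curve hugs the diagonal.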

### Methodology

There are two neural network paradigms. One is a proprietary algorithm called TurboProp(TM) 2, which is NOT based on the old backpropagation algorithm. The other uses an advanced variant of Probabilistic Neural Nets (PNN).

## Features

### Ability to Select Level of Generalization in Neural Training Strategy

Our neural method can actually be changed after it is trained so that it provides more or less generalization. Pressing the Advanced button allows you to select the level of generalization from 0% (No Enhanced Generalization) to 100% (Over Generalization). A setting of 50% is equivalent to Enhanced Generalization and is the default when the Enhanced Generalization button is checked.

### Maximum number of hidden neurons for Neural Training Strategy

You may set the number of hidden neurons to a maximum of 150 when using the Neural Training Strategy. This gives you some control over how the neural net fits data. You may even specify zero hidden neurons for a linear model.

### Maximum number of generations without improvement in Genetic Training Strategy

You may set the maximum number of generations without improvement that the algorithm will train on. The number of generations may be set between 10 and 1000 (integers only). This will allow you to control the length of training time.

### Fitness Coefficient Matrix

When using the genetic training strategy, the user has the option to change the goal of genetic optimization. The goals are to minimize the total number of incorrect classifications, to minimize the average percentage of incorrect classifications over all categories, or to maximize a custom fitness function built with the user-defined Fitness Coefficients Matrix. For example, a physician may want a model that minimizes false negatives rather than treating all wrong answers the same.
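A hedged sketch of the coefficient-matrix idea: each (actual, predicted) cell gets a weight, so false negatives can be penalized more heavily than false positives, and the optimizer maximizes the total weighted score. The weights and labels below are invented for illustration, not the product's defaults.

```python
# Custom fitness from a coefficient matrix, indexed as
# coefficients[actual][predicted]. Weights here are invented: a missed
# cancer (false negative) costs far more than a false alarm.

coefficients = {
    "cancer": {"cancer": 1.0, "benign": -5.0},  # false negative costs most
    "benign": {"benign": 1.0, "cancer": -1.0},  # false positive costs less
}

def fitness(actual, predicted):
    """Total weighted score a genetic optimizer would try to maximize."""
    return sum(coefficients[a][p] for a, p in zip(actual, predicted))

actual    = ["cancer", "cancer", "benign", "benign"]
predicted = ["cancer", "benign", "benben" if False else "benign", "benign"]
print(fitness(actual, predicted))  # 1 - 5 + 1 + 1 = -2.0
```

With all weights set to +1 for correct and -1 for incorrect answers, this reduces to simply minimizing the number of misclassifications, the first goal listed above.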

### Agreement Matrix (Contingency Table) Statistics

The agreement matrix statistics related to the comparison of the actual and predicted classifications include the following:

The category under consideration is **positive**.

- True-pos. ratio (True-Positive Ratio, also known as Sensitivity) is equal to the number of patterns classified as positive by the network that were confirmed to be positive, divided by the total number of confirmed positive patterns. It is also equal to one minus the False-Negative ratio.
- False-pos. ratio (False-Positive Ratio) is equal to the number of patterns classified as positive by the network that were confirmed to be negative, divided by the total number of confirmed negative patterns. It is also equal to one minus the True-Negative ratio.
- True-neg. ratio (True-Negative Ratio also known as Specificity) is equal to the number of patterns classified as negative by the network that were confirmed to be negative, divided by the total number of confirmed negative patterns. It is also equal to one minus the False-Positive ratio.
- False-neg. ratio (False-Negative Ratio) is equal to the number of patterns classified as negative by the network that were confirmed to be positive, divided by the total number of confirmed positive patterns. It is also equal to one minus the True-Positive ratio.

When the category under consideration is **negative**, the terms are reversed.
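The four ratios above can be computed directly from the raw agreement-matrix counts; the counts below are invented for illustration. Note the complementary pairs the definitions describe: true-positive + false-negative = 1, and true-negative + false-positive = 1.

```python
# Compute the four agreement-matrix ratios from raw counts
# (tp, fp, tn, fn = true/false positives/negatives). Counts are invented.

def agreement_ratios(tp, fp, tn, fn):
    """Return the four ratios for the 'positive' category."""
    return {
        "true_pos":  tp / (tp + fn),   # sensitivity
        "false_pos": fp / (fp + tn),
        "true_neg":  tn / (fp + tn),   # specificity
        "false_neg": fn / (tp + fn),
    }

r = agreement_ratios(tp=40, fp=5, tn=45, fn=10)
print(r)  # sensitivity 0.8, specificity 0.9
```

These are exactly the sensitivity and specificity figures the agreement matrix reports, so the same arithmetic applies to any category treated as "positive".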

### Free Technical Support

## Reviews/Testimonials

**David Ellis, The Mining Company's AI guide:** "Until NeuroShell Predictor/Classifier, I had not seen a general purpose neural net program usable by just about anybody. Others I have seen required a deep understanding of neural net technology; NeuroShell does not. I commend it to your attention."

**ORMS Today:** "AI Trilogy is an excellent package for those who are interested in getting the results of neural networks without worrying about how the neural network works or having to make adjustments to them."

**Futures Magazine:** "Not that many years ago, neural networks and related artificial intelligence technologies were state-of-the-art, required the education of a rocket scientist along with programming experience as well as big mainframe computers that couldn't be bought for money. Today - largely thanks to Ward Systems Group - the latest neural network technology can be understood and operated by any computer-literate person, at a cost less than a nice 'night on the town' for two. ... It has to be said: It can't get any easier than this."

**Patrick K. Simpson, Former President, IEEE Neural Networks Council:** "You continue to have the best classifier and prediction software in the market. Please feel free to let others know I think so."

**Part of a line of software whose customers include** 3M, AAA, AT&T, Bank of America, BMW North America, Chase Manhattan Bank, Citigroup, Coca Cola, Dell Computer, Delta Airlines, Exxon/Mobil, Federal Reserve Board, General Electric, Hewlett Packard, IBM, Merrill Lynch, Microsoft, MIT, NASA, Pfizer, Procter and Gamble, Prudential Securities, Shell Oil and many more.

*Text supplied with permission from Ward Systems Group Web Site*

**Save $390 (over 20%) when you purchase NeuroShell Classifier as a part of the AI Trilogy!**