Pattern recognition: Neural networks are well suited to problems in which the aim is to detect, interpret, and classify features or patterns within a dataset. This includes everything from identifying objects (like faces) in images, to optical character recognition, to more complex tasks like gesture recognition.
Time-series prediction and anomaly detection: Neural networks are utilized both in forecasting, such as predicting stock market trends or weather patterns, and in recognizing anomalies, which can be applied to areas like cyberattack detection and fraud prevention.
Natural language processing (NLP): One of the biggest developments in recent years has been the use of neural networks for processing and understanding human language. They’re used in various tasks including machine translation, sentiment analysis, and text summarization, and are the underlying technology behind many digital assistants and chatbots.
Signal processing and soft sensors: Neural networks play a crucial role in devices like cochlear implants and hearing aids by filtering noise and amplifying essential sounds. They’re also involved in soft sensors, software systems that process data from multiple sources to give a comprehensive analysis of the environment.
Control and adaptive decision-making systems: These applications range from autonomous vehicles like self-driving cars and drones to adaptive decision-making used in game playing, pricing models, and recommendation systems on media platforms.
Generative models: The rise of novel neural network architectures has made it possible to generate new content. These systems can synthesize images, enhance image resolution, transfer style between images, and even generate music and video.
Each input needs to be multiplied by its corresponding weight:

Input | Weight | Input × Weight
12 | 0.5 | 6
4 | –1 | –4

Step 2: Sum the Inputs
The weighted inputs are then added together:
6 + (–4) = 2
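Putting these first two steps together, here’s a minimal sketch of the weight-and-sum calculation in JavaScript, using the numbers from the table (the variable names are my own, for illustration):
// The example inputs and weights from the table
let inputs = [12, 4];
let weights = [0.5, -1];

// Step 1: weight each input. Step 2: add up the results.
let sum = 0;
for (let i = 0; i < inputs.length; i++) {
  sum += inputs[i] * weights[i];
}
console.log(sum); // 2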
Here’s the code to generate a guess:
// Create the perceptron.
let perceptron = new Perceptron(3);
// The input is three values: x, y, and the bias.
let inputs = [50, -12, 1];
// The answer!
let guess = perceptron.feedForward(inputs);
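For reference, here’s one possible version of the Perceptron class that this snippet relies on, following the description in this chapter: random initial weights and a sign-based activate() function. Treat it as a sketch rather than the exact listing; random() here is p5.js’s random-number function.
class Perceptron {
  constructor(n) {
    // Start with n random weights between -1 and 1 (random() is from p5.js).
    this.weights = [];
    for (let i = 0; i < n; i++) {
      this.weights[i] = random(-1, 1);
    }
  }

  feedForward(inputs) {
    // Steps 1 and 2: weight each input and sum the results.
    let sum = 0;
    for (let i = 0; i < this.weights.length; i++) {
      sum += inputs[i] * this.weights[i];
    }
    // Step 3: pass the sum through the activation function.
    return this.activate(sum);
  }

  activate(sum) {
    // Sign-based activation: +1 if the sum is positive, -1 otherwise.
    if (sum > 0) {
      return 1;
    } else {
      return -1;
    }
  }
}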
Figure 10.10: Data points that are linearly separable (left) and data points that are nonlinearly separable, as a curve is required to separate the points (right)
Now imagine you’re classifying plants according to soil acidity (x-axis) and temperature (y-axis). Some plants might thrive in acidic soils but only within a narrow temperature range, while other plants prefer less acidic soils but tolerate a broader range of temperatures. A more complex relationship exists between the two variables, so a straight line can’t be drawn to separate the two categories of plants, acidophilic and alkaliphilic (see Figure 10.10, right). A lone perceptron can’t handle this type of nonlinearly separable problem. (Caveat here: I’m making up these scenarios. If you happen to be a botanist, please let me know if I’m anywhere close to reality.)
One of the simplest examples of a nonlinearly separable problem is XOR (exclusive or). This is a logical operator, similar to the more familiar AND and OR. For A AND B to be true, both A and B must be true. With OR, either A or B (or both) can be true. These are both linearly separable problems. The truth tables in Figure 10.11 show their solution space. Each true or false value in the table shows the output for a particular combination of true or false inputs. See how you can draw a straight line to separate the true outputs from the false ones?
The XOR operator is the equivalent of (OR) AND (NOT AND). In other words, A XOR B evaluates to true only if exactly one of the inputs is true. If both inputs are false or both are true, the output is false. To illustrate, let’s say you’re having pizza for dinner. You love pineapple on pizza, and you love mushrooms on pizza, but put them together—yech! And plain pizza, that’s no good either!
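To make that decomposition concrete, here’s a quick sketch of XOR written as (OR) AND (NOT AND) in plain Boolean logic (no neural network yet):
function xor(a, b) {
  let or = a || b;        // true if either input is true
  let notAnd = !(a && b); // true unless both inputs are true
  return or && notAnd;    // XOR: (OR) AND (NOT AND)
}

// The four rows of the XOR truth table
console.log(xor(false, false)); // false
console.log(xor(false, true));  // true
console.log(xor(true, false));  // true
console.log(xor(true, true));   // false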
The XOR truth table in Figure 10.12 isn’t linearly separable. Try to draw a straight line to separate the true outputs from the false ones—you can’t!
The fact that a perceptron can’t even solve something as simple as XOR may seem extremely limiting. But what if I made a network out of two perceptrons? If one perceptron can solve the linearly separable OR and one perceptron can solve the linearly separable NOT AND, then two perceptrons combined can solve the nonlinearly separable XOR.
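Here’s a sketch of that idea with hand-picked weights (my own choices, not values from this chapter): one fixed-weight perceptron computes OR, another computes NOT AND, and a final AND neuron—itself linearly separable—combines their outputs. Treating true as 1 and false as 0:
// A single perceptron with fixed weights; the last input is the bias (always 1).
function perceptron(inputs, weights) {
  let sum = 0;
  for (let i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];
  }
  // Step activation: output 1 (true) if the sum is positive, 0 (false) otherwise.
  if (sum > 0) {
    return 1;
  } else {
    return 0;
  }
}

// Hand-picked weights for [a, b, bias]
let orWeights = [1, 1, -0.5];      // fires if at least one input is 1
let notAndWeights = [-1, -1, 1.5]; // fires unless both inputs are 1
let andWeights = [1, 1, -1.5];     // fires only if both inputs are 1

function xor(a, b) {
  let orOut = perceptron([a, b, 1], orWeights);
  let notAndOut = perceptron([a, b, 1], notAndWeights);
  // A final AND neuron combines the two intermediate outputs.
  return perceptron([orOut, notAndOut, 1], andWeights);
}

console.log(xor(0, 0)); // 0
console.log(xor(0, 1)); // 1
console.log(xor(1, 0)); // 1
console.log(xor(1, 1)); // 0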
When you combine multiple perceptrons, you get a multilayered perceptron, a network of many neurons (see Figure 10.13). Some are input neurons and receive the initial inputs, some are part of what’s called a hidden layer (as they’re connected to neither the inputs nor the outputs of the network directly), and then there are the output neurons, from which the results are read.