Become an Unconventional Innovator
In corporate projects, there always comes a point when a seemingly impossible problem hits you. You try everything you have learned, but nothing works for what is being asked. Your team or customer begins to look elsewhere. It's time to react.
In this chapter, a seemingly impossible business case involving material optimization will be solved with an example of a feedforward neural network (FNN) trained with backpropagation.
Feedforward networks are the building blocks of deep learning. The battle around the XOR function perfectly illustrates how deep learning regained popularity in corporate environments. The XOR FNN illustrates one of the critical functions of neural networks: classification. Once information is classified into subsets, the door opens to prediction and to many other neural network functions, such as representation learning.
An XOR FNN will be built from scratch to demystify deep learning from the start. This vintage, start-from-scratch method blows the deep learning hype off the table.
The following topics will be covered in this chapter:
- How to hand-build an FNN
- Solving XOR with an FNN
- Classification
- Backpropagation
- A cost function
- Cost function optimization
- Error loss
- Convergence
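To preview where the chapter is heading, here is a minimal sketch of the kind of start-from-scratch XOR FNN described above: pure Python, one hidden layer of sigmoid neurons, trained with backpropagation on a squared-error cost until it converges. The hidden-layer size, learning rate, epoch count, and random seed are illustrative assumptions, not the chapter's exact implementation.

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is repeatable (an assumption)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR truth table: inputs and expected outputs
DATA = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

HIDDEN = 3  # hidden-layer size (illustrative; 2 is the textbook minimum)
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(HIDDEN)]
b_h = [random.uniform(-1, 1) for _ in range(HIDDEN)]
w_o = [random.uniform(-1, 1) for _ in range(HIDDEN)]
b_o = random.uniform(-1, 1)

def forward(x):
    # Forward pass: input -> hidden sigmoids -> single sigmoid output
    h = [sigmoid(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j])
         for j in range(HIDDEN)]
    out = sigmoid(sum(w_o[j] * h[j] for j in range(HIDDEN)) + b_o)
    return h, out

def total_error():
    # Squared-error cost summed over the four XOR patterns
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA)

initial_error = total_error()
lr = 0.5  # learning rate (illustrative)
for _ in range(10000):
    for x, t in DATA:
        h, out = forward(x)
        # Backpropagation: output delta, then hidden deltas
        d_out = (out - t) * out * (1 - out)
        d_h = [d_out * w_o[j] * h[j] * (1 - h[j]) for j in range(HIDDEN)]
        # Gradient-descent weight updates
        for j in range(HIDDEN):
            w_o[j] -= lr * d_out * h[j]
            b_h[j] -= lr * d_h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h[j] * x[i]
        b_o -= lr * d_out

final_error = total_error()
print(f"cost before training: {initial_error:.3f}, after: {final_error:.3f}")
for x, t in DATA:
    print(x, "->", round(forward(x)[1], 2), "target", t)
```

Running it shows the cost function shrinking between the first and last epoch, which is exactly the error-loss and convergence behavior the chapter will unpack step by step.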