Devember 18 - Recognizing hand-written digits using a neural network in Python

I, Wasabi-Peas, will participate in the next Devember. My Devember will be building a neural network from scratch to recognize hand-written digits. I promise I will program for my Devember for at least an hour, every day of the next December. I will also write a daily public devlog and will make the produced code publicly available on the internet. No matter what, I will keep my promise.

For my Devember challenge, I am going to work on developing a neural network from scratch to recognize hand-written digits. The project will be written in Python using only numpy / scipy / matplotlib (and maybe another package that converts an input image into another form of data).

6 Likes

Can you document your learning curve? It doesn’t have to be much, maybe a one-line diary entry about how you tried today to figure out how to start, or whether you found anything at all.

Sure, and I hope you don’t mind that I have more than one sentence :slight_smile:

I have some basic understanding of neural networks. A neural network is a discrete classifier with multiple layers of nodes, each of which computes a sigmoid hypothesis. I am also currently working on a homework problem from an online ML course to build a NN in MATLAB. Translating my solution to Python shouldn’t be too bad, since my preferred language is Python. The main thing I need to figure out is how to convert a hand-written image into a table of data points, because I haven’t done that part in Python before.

So in short, here is a list of things I would need to do:

  • find a public data set of hand-written digits (e.g. http://yann.lecun.com/exdb/mnist/)
  • convert the images into Python-readable data
  • write a class (or several) that handles each node / layer and does the minimization
  • study the performance and tweak the model

Hmm… this does sound like a lot now. But we will see how much I can accomplish :smiley:

Hmmm, maybe

https://www.tensorflow.org/tutorials/ ?
https://www.kaggle.com/datasets?search=Digits ?

Also, you might want to look into numpy if you’re starting data science work in Python?

My goal is to not use tensorflow or sklearn :slight_smile: I simply want to test my logic and coding skills. And yes! Starting Kaggle is my goal after this Devember. I have been learning more about data science since I graduated, and I think it’s almost time to get a taste of Kaggle :smiley:

And yes to numpy. Everyone should know numpy and scipy; I used them a lot for my thesis work! I also finally got to meet the creator of numpy at PyData DC a few weeks ago, which was unreal!

Been eyeing a 2070 or 2080 for her so she has access to some Tensor Cores, since she only has a 970 or 1060 right now. Not sure it’s worth it yet though; once the AM4 Zen 2 stuff comes out she will be moving to my 1950X rig, so she will have plenty of CPU.

1 Like

This is over my head, but fascinating. Can’t wait to see further updates!

3 Likes

If at any point in the future you are able to convert hand-written equations into LaTeX, please let me know. :grin:

Cool project, I will definitely follow this. :+1:

1 Like

Finally got the time to get started!

It’s pretty empty right now, and it is my habit to just write random scripts in a scratch area to start things. But I have figured out how to load the training sets into a regular numpy array (scratches/dataset_loader.py)! I will work on visualizing the training set tomorrow.
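
In case anyone wants a rough idea of what that involves, here is a minimal sketch of reading the gzipped IDX files from the MNIST page into numpy arrays (the function names are just illustrative, not necessarily what ends up in scratches/dataset_loader.py):

import gzip
import numpy as np

def load_idx_images(path):
    # IDX image files: 16-byte header (magic, count, rows, cols), then uint8 pixels
    with gzip.open(path, "rb") as f:
        data = f.read()
    n_images, rows, cols = np.frombuffer(data, dtype=">u4", count=4)[1:]
    pixels = np.frombuffer(data, dtype=np.uint8, offset=16)
    return pixels.reshape(n_images, rows * cols)

def load_idx_labels(path):
    # IDX label files: 8-byte header (magic, count), then uint8 labels
    with gzip.open(path, "rb") as f:
        data = f.read()
    return np.frombuffer(data, dtype=np.uint8, offset=8)

# e.g. train_images = load_idx_images("train-images-idx3-ubyte.gz")
#      train_labels = load_idx_labels("train-labels-idx1-ubyte.gz")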

1 Like

That’s good. Progress is progress!

1 Like

Do you guys update progress in your own threads? Or should we not post progress every day?
I am relatively new here, so I’m not sure how things are done around here…

But I am able to display some of the training samples! I didn’t know that Python 3.6 added random.choices() (sampling with replacement) alongside the random.sample() I knew from Python 2.7. There is always something new to learn in Python for sure!
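
For anyone following along, the plotting part can look roughly like this. It assumes a train_images array of shape (n_samples, 784) holding the flattened 28x28 images and a matching train_labels array; those names are just illustrative:

import random
import matplotlib.pyplot as plt

def show_random_samples(train_images, train_labels, n=16):
    # pick n distinct samples (without replacement) and plot them on a 4x4 grid
    indices = random.sample(range(len(train_images)), n)
    fig, axes = plt.subplots(4, 4, figsize=(6, 6))
    for ax, idx in zip(axes.ravel(), indices):
        ax.imshow(train_images[idx].reshape(28, 28), cmap="gray")
        ax.set_title(str(train_labels[idx]))
        ax.axis("off")
    plt.tight_layout()
    plt.show()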

So tomorrow I will set up the cost function and maybe start thinking about code organization :slight_smile:

1 Like

I’m using my own thread, yeah

1 Like

I’m gonna post daily updates in a separate thread, yeah. This stuff is cool, though it’s over my head. Hope I get a chance to read through it and learn something.

3 Likes

Thanks! I will keep posting here then :slight_smile:

Yesterday was a busy day, and I didn’t have a chance to sit at my desktop.
But today, I have finished writing the cost function.
I drew out a simple structure of the neural network to start with.

The actual code is not long (see scratches/cost_function.py)…
but there is a lot going into those few lines of code…
So I basically spent tons of time writing down what I am trying to do…
Any suggestions on how to organize comments / code would be great :smiley:!
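
For readers who want a feel for what goes into those few lines, here is a rough sketch of a regularized cross-entropy cost for a single-hidden-layer network with sigmoid activations (illustrative names only, not necessarily line-for-line what is in scratches/cost_function.py):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta1, theta2, X, Y, lam):
    # X: (m, n_features) inputs, Y: (m, n_classes) one-hot labels
    # theta1, theta2: weight matrices whose first column multiplies the bias unit
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])                       # add bias unit
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ theta1.T)])  # hidden layer
    h = sigmoid(a2 @ theta2.T)                                 # (m, n_classes) predictions
    # cross-entropy averaged over the m samples
    J = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    # L2 regularization, skipping the bias columns
    J += lam / (2 * m) * (np.sum(theta1[:, 1:] ** 2) + np.sum(theta2[:, 1:] ** 2))
    return J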

Tomorrow, I will write the derivatives of the cost function using backward propagation.
Good night :sleeping: :sleeping:

4 Likes

Today I spent ~8 hours on it, and I finally got it working! I wrote a script to compute the derivatives w.r.t. all the parameters, and another to use a scipy minimizer to find the best thetas. I have uploaded my scripts to my GitHub.
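
For context, wiring a cost-plus-gradient function into a scipy minimizer generally looks something like the sketch below. It assumes a cost_and_grad(thetas, X, Y, lam) helper that returns the cost and the gradient unrolled into one flat vector; the names are illustrative and the actual scripts on my GitHub may differ:

from scipy.optimize import minimize

def train(cost_and_grad, initial_thetas, X, Y, lam, max_iter=200):
    # initial_thetas: all weight matrices unrolled into one flat vector
    result = minimize(
        cost_and_grad,
        initial_thetas,
        args=(X, Y, lam),
        jac=True,                # cost_and_grad returns (cost, gradient)
        method="CG",             # conjugate gradient
        options={"maxiter": max_iter},
    )
    return result.x              # optimized flat parameter vector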

[Screenshot from 2018-12-05]

The model is giving predictions with an accuracy of ~40% on the 60k images. There are so many things on my to-do list:

  1. better code organization. In particular, it should be a class instead of functions and blocks of code.
  2. better time performance. Right now it takes ~3 hours to train. I need to get rid of some for loops, array manipulations, and redundant computations (see the sketch after this list).
  3. better predictive performance. I need to figure out the best NN structure / regularization parameters using both the training and testing data sets.
  4. better documentation. It’s pretty hard to explain the details, but I will put in as much explanation as I can. BTW, Andrew Ng’s ML course on Coursera is legit for those who want to know how a NN works!
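
On the vectorization point (item 2), the kind of change I have in mind is replacing per-sample loops with whole-matrix operations. A toy illustration (the names here are made up, not from my actual scripts):

import numpy as np

# looped version: one forward step per training sample
def hidden_activations_loop(theta1, X):
    # X is (m, n_features + 1) with the bias column already added
    out = []
    for x in X:
        out.append(1.0 / (1.0 + np.exp(-(theta1 @ x))))
    return np.array(out)

# vectorized version: one matrix product for all samples at once
def hidden_activations_vec(theta1, X):
    return 1.0 / (1.0 + np.exp(-(X @ theta1.T)))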

Luckily, I still have ~3.5 weeks to work on those :smiley:

5 Likes

Sorry I didn’t post an update in the past two days, but I have finished writing a data_loader class. I am a self-taught Python programmer, and I am not sure when I should use @property in a class.

It turns out I have tons of properties, such as the number of selected training or testing samples / the number of pixels per image / the actual training or testing images or labels… They will be needed later when working with the neural network class…

What is people’s rule of thumb when deciding whether an attribute should be a property? Is there any rule about how many is too many?

Another Python question… how do you guys usually write __str__()? Based on the current state of the class, the string should be different… For example, the output when no samples are loaded is different from that when a set of training samples is loaded… and so I have this huge __str__()…

I don’t know if this is the best way to do it… Any suggestions are welcome :smiley:

2 Likes

A lot of this stuff is style, so it’s just my opinion.

Typically, when I write out a dynamic __str__() method, I just create a list and append the things I want to have displayed. Basically:

parts = []

if foo:
    parts.append("Foo: {}".format(foo))
if bar:
    parts.append("Bar: {}".format(bar))

# and then join it at the end however I want
return ",".join(parts)

Most of the time, I don’t use properties. Since attributes can be referenced directly in Python, I just don’t feel the need.

foo.bar = 5
print(foo.bar)

The times I do use properties are when I don’t want ugly get_foo() or set_foo() accessor methods. The property decorator is useful if you want an attribute that requires some additional processing inside the class before it is set or returned.

class Foo:
    def __init__(self):
        self._bar = [1, 2, 3]
        self._baz = 5.3

    @property
    def bar(self):
        # return a copy of the list instead of returning it directly
        # so it can't be modified outside the class
        return self._bar.copy()

    @property
    def baz(self):
        # maintain an internal type. return a converted value
        return int(self._baz)

Hope it’s useful :slight_smile:

1 Like

This is great! Thanks for the tips :smiley: I like the .join!
That is exactly why I have so many properties. I want to access the numbers of testing / training samples, the samples themselves, the number of pixels, etc. I will come back to that later as I keep writing and realize what is actually needed. Thanks again! These tips are great!
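
To make that concrete, here is a stripped-down sketch of the kind of thing I mean (not the actual class on my GitHub, just illustrative names):

import numpy as np

class DataLoader:
    def __init__(self, images, labels):
        self._images = np.asarray(images)   # (n_samples, n_pixels)
        self._labels = np.asarray(labels)

    @property
    def n_samples(self):
        return self._images.shape[0]

    @property
    def n_pixels(self):
        return self._images.shape[1]

    @property
    def images(self):
        # hand out a copy so the internal array can't be modified from outside
        return self._images.copy()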

And of course… I have been slacking for the past 3 days… Let’s hope I can make some progress tonight… :sweat_smile: