Geek Girl Joy

Artificial Intelligence, Simulations & Software


December 2018

Happy Holidays Panic

So, this happened.

And you’re probably concerned… I don’t blame you!

The sad truth is that many people are struggling under heavy student loan debt.

Drowning under mortgage payments they can’t afford or simply living paycheck to paycheck!

It’s easy to get discouraged during the holiday season in a normal year, but add insult to injury and suddenly you’re staring a recession, or even a full-blown economic depression, right in the face.

Continue reading “Happy Holidays Panic”

How to take Training Snapshots of your PHP FANN Neural Network

Today we’re going to look at how to take snapshots of your FANN Artificial Neural Network (ANN) while it’s training.

But why?

Well, maybe you want to fork competing GANs to progressively create ever more believable deepfakes because you want some of that sweet, sweet deepfake money… um… I mean, build a best-seller Writer Bot… 😉 😛

Perhaps you want to compare how different processes, configurations or algorithms affect your ANN during training, rather than waiting until the end.

Or… maybe you have a pug and a toddler who conspire to take turns crawling under your desk and flipping the power switch on the surge protector ~60 hours into a long and complex training process, and you’d rather not lose days of work… again!

OK… not a third time smarty pants! 😛
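The basic idea is simple: instead of one long uninterrupted training run, train in smaller chunks and periodically save the whole network to disk so you can resume from the last snapshot. Here’s a minimal sketch using the PHP FANN extension — the network layout, `training.data` file name and snapshot directory are just placeholders for illustration, not the actual setup from the post:

```php
<?php
// Build a snapshot file name for a given epoch (pure PHP, no FANN needed).
function snapshot_path(string $dir, int $epoch): string
{
    return rtrim($dir, '/') . '/ann_snapshot_epoch_' . $epoch . '.net';
}

// The training loop itself needs the FANN extension, so we guard it.
if (extension_loaded('fann')) {
    // Hypothetical network: 2 inputs, 3 hidden neurons, 1 output.
    $ann  = fann_create_standard(3, 2, 3, 1);
    $data = fann_read_train_from_file('training.data'); // placeholder file

    $max_epochs     = 1000;
    $snapshot_every = 100; // save a snapshot every 100 epochs

    for ($epoch = 1; $epoch <= $max_epochs; $epoch++) {
        $mse = fann_train_epoch($ann, $data); // one epoch, returns the MSE

        if ($epoch % $snapshot_every === 0) {
            // fann_save() writes the complete network (weights included),
            // so fann_create_from_file() can pick up where this left off.
            fann_save($ann, snapshot_path('.', $epoch));
        }
    }
}
```

If the power does get cut, you only lose the epochs since the last snapshot rather than the whole run.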

Continue reading “How to take Training Snapshots of your PHP FANN Neural Network”

Finished Prototype


Welcome, today we’re going to wrap up our Parts of Speech tagging bot prototype.

That doesn’t mean we’re done with this code (or database); it’s just that this prototype functions well enough to fulfill what we set out to accomplish (Tokenizing & Lexing Natural Language), so further development at this point is unnecessary for our purposes. That said, there are tons of improvements we could add in the future if we wanted to turn this prototype into a full product, and I encourage you to experiment! 🙂
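For anyone just joining us, “tokenizing” is the step where raw text gets chopped into the individual lexemes the tagger works with. Here’s a deliberately tiny sketch of that step — a simple regex-based tokenizer, not the actual code from the repo:

```php
<?php
// Minimal tokenizer sketch: split a sentence into lowercase word and
// punctuation tokens, roughly the first step a tagger performs.
function tokenize(string $text): array
{
    // Match runs of letters/digits as words, and any other
    // non-whitespace character (punctuation) as its own token.
    preg_match_all('/[a-z0-9]+|[^\sa-z0-9]/', strtolower($text), $matches);
    return $matches[0];
}

print_r(tokenize('The quick brown fox jumps over the lazy dog.'));
```

Each token can then be looked up against the lexeme database to get its part-of-speech tag — which is where words like ‘jumps’ can trip things up, as we’ll see below.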

There were some changes to the Database (last week and this week) but I have uploaded the most recent data to the GitHub Project Repo: Part-Of-Speech-Tagger

We last left off with 3 unknown lexemes, and the word ‘jumps’ was mistagged.

Additionally, our tagging process is more or less effective, but it’s not quick, and that simply won’t do if we want to use our tagger to do fun things in the future, so we’ll cover how we solve that today too.

There is much to discuss, so let’s start with the mistagged words.

Continue reading “Finished Prototype”
