Here’s the code as a Jupyter notebook: [char-rnn in PyTorch.ipynb][2]. You can click the “Open in Colab” button at the top of that page to open it in Google’s Colab service and train the model on a free GPU. All together it’s about 75 lines of code, and I’ll try to explain it in as much detail as I can in this post.
If we want 300 characters’ worth of text, we just repeat this process 300 times.
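In code, that generation loop might look roughly like this. This is a minimal sketch rather than the notebook’s exact code: I’m assuming `model` maps a batch of character indices to per-position logits over the vocabulary, and `char_to_idx` / `idx_to_char` are names I made up for the character-to-index mappings:

```python
import torch

def generate(model, char_to_idx, idx_to_char, seed="T", n_chars=300, temperature=1.0):
    # Start from a seed string and repeatedly sample the next character,
    # feeding everything generated so far back into the model each time.
    indices = [char_to_idx[c] for c in seed]
    for _ in range(n_chars):
        input_tensor = torch.tensor(indices).unsqueeze(0)  # shape: (1, seq_len)
        logits = model(input_tensor)[0, -1]                # logits for the next character
        probs = torch.softmax(logits / temperature, dim=-1)
        next_idx = torch.multinomial(probs, num_samples=1).item()
        indices.append(next_idx)
    return "".join(idx_to_char[i] for i in indices)
```

(Re-encoding the whole sequence on every step is wasteful; in practice you’d hold on to the LSTM’s hidden state and feed in only the newest character, but it keeps the sketch short.)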
### the results!
Here’s some generated output from the model where I set `temperature = 1` in the prediction function. It’s kind of like English, which is pretty impressive given that this model needed to “learn” English from scratch and is totally based on character sequences.
It doesn’t make any _sense_, but what did we expect, really?
> “An who was you colotal said that have to have been a little crimantable and beamed home the beetle. “I shall be in the head of the green for the sound of the wood. The pastor. “I child hand through the emperor’s sorthes, where the mother was a great deal down the conscious, which are all the gleam of the wood they saw the last great of the emperor’s forments, the house of a large gone there was nothing of the wonded the sound of which she saw in the converse of the beetle. “I shall know happy to him. This stories herself and the sound of the young mons feathery in the green safe.”
>
> “That was the pastor. The some and hand on the water sound of the beauty be and home to have been consider and tree and the face. The some to the froghesses and stringing to the sea, and the yellow was too intention, he was not a warm to the pastor. The pastor which are the faten to go and the world from the bell, why really the laborer’s back of most handsome that she was a caperven and the confectioned and thoughts were seated to have great made
Here’s some more generated output at `temperature=0.1`, which weights its character choices closer to “just pick the highest probability character every time”. This makes the output a lot more repetitive:
> ole the sound of the beauty of the beetle. “She was a great emperor of the sea, and the sun was so warm to the confectioned the beetle. “I shall be so many for the beetle. “I shall be so many for the beetle. “I shall be so standen for the world, and the sun was so warm to the sea, and the sun was so warm to the sea, and the sound of the world from the bell, where the beetle was the sea, and the sound of the world from the bell, where the beetle was the sea, and the sound of the wood flowers and the sound of the wood, and the sound of the world from the bell, where the world from the wood, and the sound of the
It’s weirdly obsessed with beetles and confectioners, and the sun, and the sea. Seems fine!
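To see why the low temperature gets so repetitive: the usual way temperature works (and I’d assume the notebook’s prediction function does the same) is to divide the logits by the temperature before the softmax, so small gaps between logits turn into enormous gaps between probabilities. A quick demonstration with made-up logits:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.5])  # made-up scores for 3 characters
for t in [1.0, 0.1]:
    print(t, torch.softmax(logits / t, dim=-1))
# t=1.0 gives roughly [0.63, 0.23, 0.14]: plenty of variety when sampling
# t=0.1 gives roughly [1.0, 0.00005, 0.0000003]: the top character, essentially always
```

At `temperature=0.1` the model basically picks its single favourite character every time, which is how you end up in a loop about beetles and the sea.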
### that’s all!
My results are nowhere near as good as Karpathy’s so far, maybe due to one of the following:
1. not enough training data
2. I got bored with training after an hour and didn’t have the patience to babysit the Colab notebook for longer
3. he used a 2-layer LSTM with more hidden parameters than mine; I only used 1 layer