|
vodkat posted:For a super hardcore markov bot you make a per-character neural network like this guy: http://karpathy.github.io/2015/05/21/rnn-effectiveness/ but that is way beyond my lovely programming skills or my MacBook's integrated graphics. also if anyone else wants to do this, the successor to karpathy's library is https://github.com/jcjohnson/torch-rnn coffeetable fucked around with this message at 19:00 on Feb 20, 2016 |
# ¿ Feb 20, 2016 18:58 |
|
|
vodkat posted:I made a huge (2-million-plus-word) corpus for markov botman from stupid silicon valley VC but he's still not generating very good tweets. I'm gonna try and write a better tweet generator over the next few days and report back with the results. post the file
|
# ¿ Feb 24, 2016 18:26 |
|
Snapchat A Titty posted:e: are the hashes literally generated by the markov chain too? like those aren't real hashes from the commits you fed it? it's an RNN, not a markov chain. a markov chain wouldn't be able to indent things consistently (its context isn't long enough). see http://karpathy.github.io/2015/05/21/rnn-effectiveness/ https://github.com/jcjohnson/torch-rnn
|
# ¿ Mar 5, 2016 07:39 |
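
to make the context point concrete, here's a minimal sketch of an order-k character markov chain (all names are mine, not from any library): it only ever sees the last k characters, so it can't track how deep it is inside nested code blocks the way an RNN's hidden state can.

```python
import random
from collections import defaultdict

def build_chain(text, k=4):
    # Map each k-character context to the list of characters that follow it.
    chain = defaultdict(list)
    for i in range(len(text) - k):
        chain[text[i:i + k]].append(text[i + k])
    return chain

def generate(chain, seed, length=100, rng=None):
    # Sample one character at a time; the model forgets everything
    # outside the last len(seed) characters.
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        context = "".join(out[-len(seed):])
        choices = chain.get(context)
        if not choices:
            break  # dead end: context never seen in training text
        out.append(rng.choice(choices))
    return "".join(out)
```

bump k and the chain memorizes; keep it small and it loses track of indentation after a few characters, which is the limitation being described above.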
|
vodkat posted:I'm playing around with these at the moment but I still haven't got anything I'm happy with. char-rnn seems to work fine but takes ages to run since I don't have a GPU. torch-rnn is much faster and more efficient but the output so far has mostly been unusable. if it's too slow, make the network smaller. IO aside, runtime is quadratic in the number of neurons and linear in the sequence length and number of layers. performance should degrade fairly gracefully as you cut it down to size
|
# ¿ Mar 6, 2016 15:09 |
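
the scaling claim above is easy to sanity-check with a back-of-the-envelope cost model (my own toy function, not anything from torch-rnn): the hidden-to-hidden matmul dominates, so halving the hidden size cuts cost about 4x, while halving the sequence length only buys 2x.

```python
def rnn_cost(neurons, seq_len, layers):
    # Per-timestep cost is dominated by the neurons x neurons
    # hidden-to-hidden multiply, repeated for each step and layer.
    return layers * seq_len * neurons ** 2

full = rnn_cost(512, 50, 2)
half_neurons = rnn_cost(256, 50, 2)   # 4x cheaper
half_seq = rnn_cost(512, 25, 2)       # only 2x cheaper
```

so on a CPU, shrinking the hidden layer is the highest-leverage knob.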
|
unpacked robinhood posted:MS stole the markov twitter bot fed from other tweets idea except it's called an AI now probably based on this paper http://arxiv.org/pdf/1506.06714.pdf
|
# ¿ Mar 24, 2016 15:01 |