Bag of Tricks in Machine Learning

Public Personal Notebook

random

A Hitchhiker’s Guide On Distributed Training of Deep Neural Networks

January 14, 2019 by admin

https://arxiv.org/abs/1810.11787

Categories random Tags distributedtraining

Sleepwalk – walk through embedding (S. Anders)

January 12, 2019 by admin

https://anders-biostat.github.io/sleepwalk/

Categories nlp, random

DNN in SQL

January 11, 2019 by admin

https://towardsdatascience.com/deep-neural-network-implemented-in-pure-sql-over-bigquery-f3ed245814d3

Categories random Tags sql

Random seed optimization in GAN research?

January 10, 2019 by admin

Two quotes from the paper “Are GANs Created Equal? A Large-Scale Study”:

  • Even with everything else being fixed, varying the random seed may influence the results
  • Authors often report the best FID, which opens the door for random seed optimization
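As a hypothetical illustration (not from the paper), a tiny simulation shows why reporting the best score over seeds is misleading: the minimum over seeds is always at least as good as the mean, so cherry-picking the seed inflates the reported quality. The `mock_fid` function is invented here to stand in for a real training-plus-evaluation run.

```python
import random

# Hypothetical stand-in: pretend the final FID of a GAN run depends only
# on the random seed, with everything else fixed.
def mock_fid(seed):
    rng = random.Random(seed)
    return 30.0 + rng.gauss(0, 2.0)  # pretend FID ~ N(30, 2)

scores = [mock_fid(seed) for seed in range(50)]
mean_fid = sum(scores) / len(scores)
best_fid = min(scores)  # lower FID is better

print(f"mean FID over 50 seeds: {mean_fid:.2f}")
print(f"best FID over 50 seeds: {best_fid:.2f}")  # "seed optimization"
```

The best-over-seeds number is guaranteed to look at least as good as the average run, which is why the paper argues for reporting distributions over seeds rather than a single best score.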
Categories random Tags gan

37 Reasons why your Neural Network is not working (Slav Ivanov)

January 9, 2019 by admin

https://blog.slavv.com/37-reasons-why-your-neural-network-is-not-working-4020854bd607

Categories random Tags nn

Why should we install TensorFlow with conda?

January 9, 2019 by admin

https://towardsdatascience.com/stop-installing-tensorflow-using-pip-for-performance-sake-5854f9d9eb0c

Categories random Tags random

Tensorflow with Python 3.7?

January 9, 2019 by admin

Anaconda’s default Python is now 3.7; however, it is tricky to install TensorFlow on top of it:

https://github.com/tensorflow/tensorflow/issues/23478#issue-377079541

For now, it is better to downgrade Python: https://stackoverflow.com/questions/52584907/how-to-downgrade-python-from-3-7-to-3-6
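A sketch of the downgrade route, assuming conda is used (TensorFlow did not ship Python 3.7 wheels at the time; env name `tf36` is arbitrary):

```shell
# Create a separate conda environment pinned to Python 3.6,
# then install TensorFlow into it (leaves the base env untouched).
conda create -n tf36 python=3.6
conda activate tf36
pip install tensorflow
```

Pinning Python inside a dedicated environment is safer than downgrading the base interpreter itself.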

Categories random Tags random

Trick = Experience + Magic

January 8, 2019 (updated August 25, 2019) by admin

(how large is the size of the decoder)
Categories random Tags random

Best practice to follow this website

  1. In Feedly, click “add content”
  2. Enter the URL “bagoftricks.ml” and follow


Categories

  • ml (2)
  • nlp (9)
  • papers (1)
  • random (8)
  • reinforcementlearning (1)
  • Uncategorized (57)

Tags

anomaly (1) automl (1) ctr (1) cv (2) data (1) distributedtraining (1) gan (1) kaggle (1) list (2) ml (2) nlp (9) nn (1) paper (1) random (3) reinforcementlearning (1) sql (1)

Recent Posts

  • Data Augmentation in NLP
  • PyTorch nn.DistributedDataParallel
  • Unsupervised Pre-Training of Image Features on Non-Curated Data (by FAIR)
  • Image Classification Papers
  • PyTorch fast

Archives

  • May 2020 (1)
  • October 2019 (1)
  • August 2019 (11)
  • July 2019 (8)
  • June 2019 (6)
  • May 2019 (1)
  • April 2019 (11)
  • March 2019 (4)
  • February 2019 (2)
  • January 2019 (32)


    © 2021 Bag of Tricks in Machine Learning • Powered by GeneratePress