Keras source code: how do I find it? - python

To use Keras effectively, you sometimes need to read the source code of its methods. I know that every function in Keras is open source and accessible to the public, but finding the code on the web is not trivial until you have some experience. For example, https://keras.io/ does not explain the easiest way to find the source for a specific method.
My question: can someone please point me to the implementation of the softmax activation in Keras with the TensorFlow backend, or recommend a good way to get to it?

You can search the repository on GitHub using the search bar. You'll find it in keras/activations.py, which invokes the same function from the Keras backend. All the backends live under keras/backend, and the TensorFlow backend specifically is keras/backend/tensorflow_backend.py. In TensorFlow itself, you can find the corresponding kernel definition at tensorflow/core/kernels/softmax_op.
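For orientation, the computation those files ultimately implement is the standard numerically stable softmax. A minimal NumPy sketch of the math (not the Keras source itself, just what it computes):

    import numpy as np

    def softmax(x, axis=-1):
        # Subtract the max before exponentiating for numerical stability,
        # as the real kernels do
        e = np.exp(x - np.max(x, axis=axis, keepdims=True))
        return e / np.sum(e, axis=axis, keepdims=True)

    print(softmax(np.array([1.0, 2.0, 3.0])))  # [0.09003057 0.24472847 0.66524096]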

There is another way to get the source code, which can be useful especially if you are not using the latest version (the one available on GitHub), so I am adding it here.
You can always find the Keras source code directly on your PC if you have the keras package installed. It lives in the package directory, typically: /python3.your_version/site-packages/keras
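If you go this route, Python's inspect module can locate that directory and even print a method's source for you; a quick sketch, assuming keras is importable:

    import inspect
    import keras

    # Where the installed package lives on this machine
    print(inspect.getsourcefile(keras))

    # Print the installed implementation of the softmax activation
    print(inspect.getsource(keras.activations.softmax))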

It looks like the Keras source code can be found in the GitHub repository for Keras.
Unlike PyTorch, whose documentation links each function directly to the corresponding source code, in Keras the two seem to be disconnected.
One way to find the source for a specific component is to manually go through the folders in the GitHub repository above.
I did that, and it led me to the Keras softmax source code.
There might be better ways of getting to this source code, but I am not aware of any.

Related

Is it possible to run PyTorch in brython script?

I'd like to embed a simple PyTorch model in a webpage. Is this something accomplishable with brython? If not, is there another tool available that would allow for PyTorch scripts to be executed without a separate server hosting the code?
I am not sure about brython, but I believe torchserve can be used to accomplish your task.
Take a look at its documentation:
https://pytorch.org/serve/
EDIT: Based on the comments:
I found this repo, which is like a substitute for TensorFlow.js except that it works for PyTorch. It is called torch.js. It should allow your model to run in a Node.js program. Since this repo is not as official as TensorFlow.js, another thing I would suggest is to convert your PyTorch model to ONNX, then to TensorFlow, and then to TensorFlow.js. Then you will be able to accomplish your task. Some links about TensorFlow.js can be found here.
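The first step of that conversion chain is exporting to ONNX, which PyTorch supports directly. A minimal sketch with a hypothetical toy model (your real model and input shape will differ):

    import torch
    import torch.nn as nn

    # Toy model standing in for whatever model you want to embed
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
    model.eval()

    # The dummy input fixes the expected input shape for the exported graph
    dummy_input = torch.randn(1, 4)
    torch.onnx.export(model, dummy_input, "model.onnx",
                      input_names=["input"], output_names=["output"])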

Scripts missing for GPT-2 fine-tuning and inference in the Hugging Face GitHub?

I am following the documentation on the Hugging Face website, where it says that to fine-tune GPT-2 I should use the script run_lm_finetuning.py for fine-tuning, and the script run_generation.py for inference.
However, neither script exists on GitHub anymore.
Does anybody know whether the documentation is outdated, or where to find those two scripts?
Thanks
It looks like they've been moved around a couple of times and the docs are indeed out of date. The current version can be found in run_language_modeling.py here: https://github.com/huggingface/transformers/tree/master/examples/language-modeling
The links have been moved around again; you can find everything related to language modeling here.
All the needed information, including the location of the old script, is in the README.
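For reference, inference no longer requires the old script at all; the generate API covers the run_generation.py use case. A minimal sketch with the current transformers API ("gpt2" is just the smallest public checkpoint):

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    inputs = tokenizer("The documentation is outdated because", return_tensors="pt")
    outputs = model.generate(**inputs, max_length=30, do_sample=True, top_k=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))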

Use trained TensorFlow graphs in Matlab

I would like to train TensorFlow models with the Python API but use the trained graphs for inference in Matlab. I have searched for ways to do this, but I can't seem to figure it out.
Does anybody have a good idea how to do this? Do I have to compile the model with Bazel? Do I use TensorFlow Serving? Do I load the metagraph in a C++ function that I include in Matlab?
Please keep in mind that I'm an engineer and don't have extensive programming knowledge :)
In case someone lands here with a similar question, I'd like to suggest tensorflow.m - a Matlab package I am currently writing (available on GitHub).
Although it is still in development, simple functionality like importing a frozen graph and running inference is already possible (see the examples) - I believe this is what you were looking for.
The advantage is that you don't need any expensive toolbox or a Python/TensorFlow installation on your machine. I'd be glad if the package is of use to anyone looking for a similar solution; even more so if you extend or implement something and open a PR.
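On the Python side, producing such a frozen graph is the main prerequisite. A TF1-style sketch that freezes a toy graph into a self-contained .pb file (the model and the file name frozen_graph.pb are placeholders):

    import tensorflow.compat.v1 as tf
    tf.disable_v2_behavior()

    # Toy graph standing in for your trained model
    x = tf.placeholder(tf.float32, [None, 4], name="input")
    w = tf.Variable(tf.random_normal([4, 2]), name="w")
    y = tf.identity(tf.matmul(x, w), name="output")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Bake variable values into constants so the .pb is self-contained
        frozen = tf.graph_util.convert_variables_to_constants(
            sess, sess.graph_def, ["output"])
        tf.io.write_graph(frozen, ".", "frozen_graph.pb", as_text=False)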

Neural Network implementation using Pybrain

I am new to Python and want to implement a simple neural network, and I am using PyCharm. I looked into which library is used for this purpose and found that PyBrain can be used. I successfully installed it on my system and configured it with PyCharm.
Now I am searching for simple sample code for a neural network using the PyBrain library, but I have not found a complete example. The PyBrain docs could also use more explanation. Is there any way to find detailed documentation of the functionality in PyBrain?
As a beginner, did I follow the correct path? Maybe you will feel that this question should not be asked, but as a beginner I have to solve it. If anyone can help me in this regard, I would be thankful.
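Since no complete example was posted, here is a minimal sketch using PyBrain's documented buildNetwork/BackpropTrainer API, training a tiny network on XOR:

    from pybrain.tools.shortcuts import buildNetwork
    from pybrain.datasets import SupervisedDataSet
    from pybrain.supervised.trainers import BackpropTrainer

    # XOR dataset: 2 inputs, 1 output
    ds = SupervisedDataSet(2, 1)
    for inp, target in [((0, 0), (0,)), ((0, 1), (1,)),
                        ((1, 0), (1,)), ((1, 1), (0,))]:
        ds.addSample(inp, target)

    # 2 input units, 3 hidden units, 1 output unit
    net = buildNetwork(2, 3, 1, bias=True)
    trainer = BackpropTrainer(net, ds)
    for _ in range(1000):
        trainer.train()  # one epoch per call

    print(net.activate((0, 1)))  # should be close to 1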

What is the purpose of the tf.contrib module in Tensorflow?

I'm curious about what tf.contrib is, and why the code is included in TensorFlow but not in the main repository.
Furthermore, looking at the example here (from the tensorflow master branch), I want to find the source for tf.contrib.layers.sparse_column_with_hash_bucket.
These seem like some cool routines, but I wanted to make sure they properly use queues, etc., for pre-fetching/pre-processing examples before actually using them in a production setting.
It appears to be documented here, but that page is from the tflearn project, and tf.contrib.layers.sparse_column_with_hash_bucket doesn't seem to be in that repository either.
In general, tf.contrib contains contributed code. It is meant to hold features and contributions that should eventually be merged into core TensorFlow, but whose interfaces may still change, or which need further testing to see whether they can find broader acceptance.
The code in tf.contrib isn't supported by the TensorFlow team. It is included in the hope that it is helpful, but it might change or be removed at any time; there are no guarantees.
The source of tf.contrib.layers.sparse_column_with_hash_bucket can be found at
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/layers/python/layers/feature_column.py#L365
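For context, this is how the column was typically used while it lived in tf.contrib (TF 1.x only; tf.contrib was removed in TF 2.x):

    import tensorflow as tf  # TF 1.x, where tf.contrib still exists

    # Hash arbitrary string values into a fixed number of buckets; useful
    # for high-cardinality categorical features such as "occupation"
    occupation = tf.contrib.layers.sparse_column_with_hash_bucket(
        "occupation", hash_bucket_size=1000)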
