I'm trying to download a pretrained TensorFlow.js model, including its weights, to be used offline in Python with the standard version of TensorFlow, as part of a project that is not at an early stage by any means, so switching to TensorFlow.js is not a possibility. But I can't figure out how to download those models, or whether some conversion of the model is necessary.

I'm aware that in JavaScript I can access the models and use them by including them like this, but how do I actually get the .ckpt files, or the frozen model if that's the case?

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/[email protected]"></script>

<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/[email protected]"></script>

My final objective is to get the frozen model files and get the outputs as is done in the normal version of TensorFlow. This will also be used in an offline environment, so any online-only reference would not be useful.

Thanks for your replies

Asked Jan 6, 2019 at 21:16 by Boris; edited Jan 7, 2019 at 1:46
  • I wonder why you want to create your model in JS and export it to Python. What prevents you from creating the model directly in Python? – edkeveked Commented Jan 6, 2019 at 23:07
  • No, I don't want to create a new one, I want to download a trained one; sorry I wasn't clear about it – Boris Commented Jan 7, 2019 at 1:45
  • @edkeveked — isn't it perfectly clear that the OP wants to use the fully-trained model from a Python environment and hence is asking how to download the model weights directly? – Jivan Commented Feb 19, 2020 at 23:39
  • @Boris have you found a solution? I'm looking for the exact same thing here. – Jivan Commented Feb 19, 2020 at 23:39

2 Answers

You can save the model topology and its weights by calling the model's save method.

const model = tf.sequential();
model.add(tf.layers.dense(
     {units: 1, inputShape: [10], activation: 'sigmoid'}));
const saveResult = await model.save('downloads://mymodel');
// This will trigger downloading of two files:
//   'mymodel.json' and 'mymodel.weights.bin'.
console.log(saveResult);

There are different scheme strings depending on where the model and its weights are saved (localstorage://, indexeddb://, downloads://, http://, and file:// in Node.js). See the tf.Model.save documentation.

I went to https://storage.googleapis.com/tfjs-models/ and found the directory listing all the files. I picked out the relevant files (I wanted all the MobileNet float models, as opposed to the quantized MobileNet) and populated this file_uris list.

base_uri = "https://storage.googleapis.com/tfjs-models/"
file_uris = [
    "savedmodel/posenet/mobilenet/float/050/group1-shard1of1.bin",
    "savedmodel/posenet/mobilenet/float/050/model-stride16.json",
    "savedmodel/posenet/mobilenet/float/050/model-stride8.json",
    "savedmodel/posenet/mobilenet/float/075/group1-shard1of2.bin",
    "savedmodel/posenet/mobilenet/float/075/group1-shard2of2.bin",
    "savedmodel/posenet/mobilenet/float/075/model-stride16.json",
    "savedmodel/posenet/mobilenet/float/075/model-stride8.json",
    "savedmodel/posenet/mobilenet/float/100/group1-shard1of4.bin",
    "savedmodel/posenet/mobilenet/float/100/group1-shard2of4.bin",
    "savedmodel/posenet/mobilenet/float/100/group1-shard3of4.bin",
    "savedmodel/posenet/mobilenet/float/100/model-stride16.json",
    "savedmodel/posenet/mobilenet/float/100/model-stride8.json"
]

Then I used Python to download the files iteratively into matching local folders.

from urllib.request import urlretrieve
from pathlib import Path

for file_uri in file_uris:
    uri = base_uri + file_uri
    # Recreate the remote directory structure locally, then download into it.
    save_path = "/".join(file_uri.split("/")[:-1])
    Path(save_path).mkdir(parents=True, exist_ok=True)
    urlretrieve(uri, file_uri)
    print(save_path, file_uri)
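Rather than hand-listing every shard, the weight file names can also be read from the model.json files themselves: a TensorFlow.js model.json records its weight shards under a weightsManifest key. A minimal sketch of that parsing, using a made-up manifest literal in place of a downloaded file:

```python
import json

# Stand-in for a downloaded tfjs model.json (structure as in real files,
# contents invented for illustration).
manifest = json.loads("""
{
  "modelTopology": {},
  "weightsManifest": [
    {"paths": ["group1-shard1of2.bin", "group1-shard2of2.bin"],
     "weights": [{"name": "conv1/weights", "shape": [3, 3, 3, 8],
                  "dtype": "float32"}]}
  ]
}
""")

# Collect the shard file names, relative to the model.json location.
shard_paths = [p for group in manifest["weightsManifest"]
               for p in group["paths"]]
print(shard_paths)  # ['group1-shard1of2.bin', 'group1-shard2of2.bin']
```

Joining each shard path onto the model.json's base URI gives the same download list as above, without guessing file names.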

I used JupyterLab (Jupyter Notebook also works) while experimenting with this code.

With this, you'll get a folder with the .bin files (the weights) and the .json files (the graph model). Unfortunately, these are graph models, so they cannot be converted into SavedModels, which makes them useless for your case. Let me know if someone finds a way of running these tfjs graph model files in regular TensorFlow (preferably 2.0+).


You can also download zip files with the 'entire' model from TFHub; for example, a 2-byte-quantized ResNet PoseNet is available here.
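Once such an archive is downloaded, Python's standard zipfile module can unpack it. A minimal sketch, building a stand-in archive in memory (the file names are invented for illustration, since the real download needs network access):

```python
import io
import zipfile

# Stand-in for a downloaded TFHub archive: a zip holding a model.json
# and one weight shard, built in memory purely for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("model.json", "{}")
    zf.writestr("group1-shard1of1.bin", b"\x00" * 4)

# Extraction step: list the archive's members (and optionally unpack them).
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    members = zf.namelist()
    # zf.extractall("posenet_resnet")  # uncomment to write the files to disk

print(members)  # ['model.json', 'group1-shard1of1.bin']
```

The same extractall call works unchanged on an archive saved to disk with urlretrieve.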
