I followed https://github.com/datitran/raccoon_dataset to train on my own dataset and ran into a problem. Environment information:
OS: Windows 10, 64-bit
CUDA and cuDNN: CUDA 10 / cuDNN 7.5.0
TensorFlow version: 1.13.1 (GPU)
Python version: 3.7.0, 64-bit (Anaconda, Inc. on win32)
Model: ssd_mobilenet_v1_coco
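For reference, a small sanity check of the environment (my own snippet, not part of the tutorial) that prints the TensorFlow build and whether the GPU is visible; the expected output matches the versions listed above:

import tensorflow as tf

# Environment check only, using the TF 1.x test APIs.
print("TensorFlow:", tf.__version__)                 # expected: 1.13.1
print("Built with CUDA:", tf.test.is_built_with_cuda())
print("GPU available:", tf.test.is_gpu_available())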
Problem:
I want to start training, but I get the following error (the full command I ran is shown after the traceback):
WARNING: The TensorFlow contrib module will not be included in TensorFlow 2.0.
For more information, please see:
* https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md
* https://github.com/tensorflow/addons
If you depend on functionality not listed there, please file an issue.
WARNING:tensorflow:From D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\platform\app.py:125: main (from __main__) is deprecated and will be removed in a future version.
Instructions for updating:
Use object_detection/model_main.py.
WARNING:tensorflow:From D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\legacy\trainer.py:266: create_global_step (from tensorflow.contrib.framework.python.ops.variables) is deprecated and will be removed in a future version.
Instructions for updating:
Please switch to tf.train.create_global_step
WARNING:tensorflow:From D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
WARNING:tensorflow:num_readers has been reduced to 1 to match input file shards.
WARNING:tensorflow:From D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\builders\dataset_builder.py:80: parallel_interleave (from tensorflow.contrib.data.python.ops.interleave_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.data.experimental.parallel_interleave(...)`.
Traceback (most recent call last):
File "train.py", line 186, in <module>
tf.app.run()
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\platform\app.py", line 125, in run
_sys.exit(main(argv))
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
return func(*args, **kwargs)
File "train.py", line 182, in main
graph_hook_fn=graph_rewriter_fn)
File "D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\legacy\trainer.py", line 280, in train
train_config.prefetch_queue_capacity, data_augmentation_options)
File "D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\legacy\trainer.py", line 59, in create_input_queue
tensor_dict = create_tensor_dict_fn()
File "train.py", line 123, in get_next
dataset_builder.build(config)).get_next()
File "D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\builders\dataset_builder.py", line 134, in build
config.input_path[:], input_reader_config)
File "D:\Anaconda\envs\tensorflow\Lib\models\research\object_detection\builders\dataset_builder.py", line 80, in read_dataset
sloppy=config.shuffle))
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 1605, in apply
return DatasetV1Adapter(super(DatasetV1, self).apply(transformation_func))
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 1127, in apply
dataset = transformation_func(self)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\experimental\ops\interleave_ops.py", line 88, in _apply_fn
buffer_output_elements, prefetch_input_elements)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\readers.py", line 133, in __init__
cycle_length, block_length)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2827, in __init__
super(InterleaveDataset, self).__init__(input_dataset, map_func)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2798, in __init__
map_func, self._transformation_name(), dataset=input_dataset)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2124, in __init__
self._function.add_to_graph(ops.get_default_graph())
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 490, in add_to_graph
self._create_definition_if_needed()
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 341, in _create_definition_if_needed
self._create_definition_if_needed_impl()
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 355, in _create_definition_if_needed_impl
whitelisted_stateful_ops=self._whitelisted_stateful_ops)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 883, in func_graph_from_py_func
outputs = func(*func_graph.inputs)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2099, in tf_data_structured_function_wrapper
ret = func(*nested_args)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\readers.py", line 247, in __init__
filenames, compression_type, buffer_size, num_parallel_reads)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\readers.py", line 212, in __init__
self._impl = filenames.flat_map(read_one_file)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 1005, in flat_map
return FlatMapDataset(self, map_func)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2798, in __init__
map_func, self._transformation_name(), dataset=input_dataset)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2124, in __init__
self._function.add_to_graph(ops.get_default_graph())
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 490, in add_to_graph
self._create_definition_if_needed()
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 341, in _create_definition_if_needed
self._create_definition_if_needed_impl()
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 355, in _create_definition_if_needed_impl
whitelisted_stateful_ops=self._whitelisted_stateful_ops)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\function.py", line 883, in func_graph_from_py_func
outputs = func(*func_graph.inputs)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\dataset_ops.py", line 2099, in tf_data_structured_function_wrapper
ret = func(*nested_args)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\readers.py", line 209, in read_one_file
return _TFRecordDataset(filename, compression_type, buffer_size)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\ops\readers.py", line 111, in __init__
argument_dtype=dtypes.string)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\data\util\convert.py", line 35, in optional_param_to_tensor
argument_default, dtype=argument_dtype, name=argument_name)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\constant_op.py", line 245, in constant
allow_broadcast=True)
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\constant_op.py", line 283, in _constant_impl
allow_broadcast=allow_broadcast))
File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\tensor_util.py", line 501, in make_tensor_proto
(dtype, nparray.dtype, values))
TypeError: Incompatible types: <dtype: 'string'> vs. object. Value is
The command I ran was:

python train.py --logtostderr --pipeline_config_path=Z:/Train/ssd_mobilenet_v1_coco_ship.config --train_dir=Z:/Train/train
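Since the traceback shows the failure happening while dataset_builder.read_dataset builds the input pipeline from config.input_path, here is a small diagnostic snippet I put together (my own, not from the tutorial) to inspect what the train_input_reader in my pipeline config resolves to. It only reads the config file passed on the command line above and lists what each input_path pattern matches on disk:

# Diagnostic only: print the train_input_reader paths that read_dataset will consume.
import tensorflow as tf
from google.protobuf import text_format
from object_detection.protos import pipeline_pb2

pipeline_config = pipeline_pb2.TrainEvalPipelineConfig()
with tf.gfile.GFile("Z:/Train/ssd_mobilenet_v1_coco_ship.config", "r") as f:
    text_format.Merge(f.read(), pipeline_config)

input_paths = list(pipeline_config.train_input_reader.tf_record_input_reader.input_path)
print("input_path entries:", input_paths)
for path in input_paths:
    # An empty glob result means the pattern matches no TFRecord file on disk.
    print(path, "->", tf.gfile.Glob(path))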