Try importing tensorflow:
(venv) $ python
>>> import tensorflow as tf
2021-04-17 12:29:16.608101: E tensorflow/core/platform/hadoop/hadoop_file_system.cc:132]
HadoopFileSystem load error: libhdfs.so: cannot open shared object file: No such file or directory
Traceback (most recent call last):
File "", line 1, in
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 98, in
from tensorflow_core import *
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/__init__.py", line 28, in
from tensorflow.python import pywrap_tensorflow # pylint: disable=unused-import
File "", line 1019, in _handle_fromlist
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 50, in __getattr__
module = self._load()
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 44, in _load
module = _importlib.import_module(self.__name__)
File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/__init__.py", line 88, in
from tensorflow.python import keras
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/keras/__init__.py", line 26, in
from tensorflow.python.keras import activations
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/keras/__init__.py", line 26, in
from tensorflow.python.keras import activations
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/keras/activations.py", line 23, in
from tensorflow.python.keras.utils.generic_utils import deserialize_keras_object
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/keras/utils/__init__.py", line 34, in
from tensorflow.python.keras.utils.io_utils import HDF5Matrix
File "/home/pi/venv/lib/python3.7/site-packages/tensorflow_core/python/keras/utils/io_utils.py", line 30, in
import h5py
File "/home/pi/venv/lib/python3.7/site-packages/h5py/__init__.py", line 25, in
from . import _errors
File "h5py/_errors.pyx", line 1, in init h5py._errors
ValueError: numpy.ndarray size changed, may indicate binary incompatibility.
Expected 44 from C header, got 40 from PyObject
>>>
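The ValueError at the end is the real failure: the h5py wheel was built against a newer numpy C API than the numpy this interpreter is picking up. A quick way to confirm which numpy copy is actually being loaded (a sketch, using only the standard numpy.__version__ and numpy.__file__ attributes):
(venv) $ python
>>> import numpy
>>> numpy.__version__   # expected: '1.16.2', the old system copy
>>> numpy.__file__      # expected: a path under /usr/lib/python3/dist-packages, not the venv
>>>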
Since the end of the error message complains about numpy, try upgrading numpy:
(venv) $ pip show numpy
Name: numpy
Version: 1.16.2
‥‥
(venv) $ pip install --upgrade numpy
Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Requirement already satisfied: numpy in /usr/lib/python3/dist-packages (1.16.2)
Collecting numpy
Downloading https://www.piwheels.org/simple/numpy/numpy-1.20.2-cp37-cp37m-linux_armv7l.whl (11.6 MB)
|████████████████████████████████| 11.6 MB 16 kB/s
Installing collected packages: numpy
Attempting uninstall: numpy
Found existing installation: numpy 1.16.2
Not uninstalling numpy at /usr/lib/python3/dist-packages, outside environment /home/pi/venv
Can't uninstall 'numpy'. No files were found to uninstall.
Successfully installed numpy-1.20.2
(venv) $ pip show numpy
Name: numpy
Version: 1.20.2
‥‥
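Note the "Not uninstalling numpy at /usr/lib/python3/dist-packages, outside environment /home/pi/venv" line above: pip leaves the system 1.16.2 in place and installs 1.20.2 into the venv, and the venv copy takes precedence on sys.path. A one-liner to double-check which copy is now picked up (a sketch):
(venv) $ python -c "import numpy; print(numpy.__version__, numpy.__file__)"
(Expected: 1.20.2 and a path under /home/pi/venv/lib/python3.7/site-packages.)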
- Test "import tensorflow as tf" again:
(venv) $ python
>>> import tensorflow as tf
2021-04-17 13:09:51.797695: E tensorflow/core/platform/hadoop/hadoop_file_system.cc:132]
HadoopFileSystem load error: libhdfs.so: cannot open shared object file: No such file or directory
>>> print(tf.reduce_sum(tf.random.normal([1000, 1000])))
Tensor("Sum:0", shape=(), dtype=float32)
>>>
$ python -c "import tensorflow as tf;print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
2021-04-17 13:19:25.567367: E tensorflow/core/platform/hadoop/hadoop_file_system.cc:132]
HadoopFileSystem load error: libhdfs.so: cannot open shared object file: No such file or directory
Tensor("Sum:0", shape=(), dtype=float32)
(venv) $
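Note that print() shows a symbolic Tensor("Sum:0", shape=(), dtype=float32) rather than an actual number, which indicates this is a TensorFlow 1.x build running in graph mode (eager execution off by default). To get a concrete value, the tensor has to be evaluated in a session, roughly like this (a sketch, assuming a 1.x build that provides tf.compat.v1):
(venv) $ python
>>> import tensorflow as tf
>>> x = tf.reduce_sum(tf.random.normal([1000, 1000]))   # still a symbolic tensor here
>>> sess = tf.compat.v1.Session()                        # graph mode needs a session
>>> print(sess.run(x))                                   # now prints a concrete float
>>> sess.close()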
A different error message (the HadoopFileSystem one) now appears instead, but "import tensorflow as tf" itself succeeds.
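That remaining message only means TensorFlow could not load its optional HDFS filesystem support (libhdfs.so is not installed); it is harmless for normal use. If the noise is bothersome, raising the C++ log level via the standard TF_CPP_MIN_LOG_LEVEL environment variable may hide it (level 3 filters ERROR-level messages; worth verifying on this build):
(venv) $ TF_CPP_MIN_LOG_LEVEL=3 python -c "import tensorflow as tf; print(tf.__version__)"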