meta-intel/lib/oeqa/runtime/miutils/tests/squeezenet_model_download_test.py
Yeoh Ee Peng 1e514838df oeqa/runtime/cases/dldt: Enable inference engine and model optimizer tests
Add sanity tests for inference engine:
   - test inference engine c/cpp shared library
   - test inference engine python api
   - test inference engine cpu, gpu, myriad plugin
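
For orientation, a hedged sketch of the kind of call such a plugin sanity check could make: it drives the bundled classification sample on the target through the same target.run() pattern used by the helper class below. The sample directory, IR path, input image and device names are assumptions, not the exact test implementation.

# Hypothetical helper (not part of this file): run the OpenVINO classification
# sample against one device plugin and return its (status, output) pair.
def run_classification_sample(target, sample_dir, model_xml, image, device):
    cmd = ('python3 %s/classification_sample.py -m %s -i %s -d %s' %
           (sample_dir, model_xml, image, device))
    return target.run(cmd)  # device is expected to be CPU, GPU or MYRIAD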

Add sanity tests for model optimizer:
   - test model optimizer can generate ir
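
In the same spirit, a minimal sketch of the IR generation step that the model optimizer test exercises; the mo.py location under <install_root> and the Caffe-specific flags are assumptions based on a typical OpenVINO 2019 layout.

# Hypothetical helper (not part of this file): convert the downloaded Caffe
# model to IR with the model optimizer on the target device.
def generate_squeezenet_ir(target, install_root, work_dir):
    mo = '%s/deployment_tools/model_optimizer/mo.py' % install_root
    cmd = ('python3 %s --input_model %s/squeezenet_v1.1.caffemodel '
           '--input_proto %s/deploy.prototxt --output_dir %s' %
           (mo, work_dir, work_dir, work_dir))
    return target.run(cmd)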

Licenses:
   - classification_sample.py
     license: Apache 2.0
     source: <install_root>/deployment_tools/inference_engine/samples/*

Signed-off-by: Yeoh Ee Peng <ee.peng.yeoh@intel.com>
Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
2019-12-10 13:31:24 +08:00

class SqueezenetModelDownloadTest(object):
    # Upstream SqueezeNet v1.1 topology and weights, pinned to a fixed commit.
    download_files = {'squeezenet1.1.prototxt': 'https://raw.githubusercontent.com/DeepScale/SqueezeNet/a47b6f13d30985279789d08053d37013d67d131b/SqueezeNet_v1.1/deploy.prototxt',
                      'squeezenet1.1.caffemodel': 'https://github.com/DeepScale/SqueezeNet/raw/a47b6f13d30985279789d08053d37013d67d131b/SqueezeNet_v1.1/squeezenet_v1.1.caffemodel'}

    def __init__(self, target, work_dir):
        self.target = target
        self.work_dir = work_dir

    def setup(self):
        # Create the scratch directory on the target device.
        self.target.run('mkdir -p %s' % self.work_dir)

    def tear_down(self):
        self.target.run('rm -rf %s' % self.work_dir)

    def test_can_download_squeezenet_model(self, proxy_port):
        # '-e https_proxy=...' hands the caller's proxy to wget as a wgetrc command.
        return self.target.run('cd %s; wget %s -e https_proxy=%s' %
                               (self.work_dir,
                                self.download_files['squeezenet1.1.caffemodel'],
                                proxy_port))

    def test_can_download_squeezenet_prototxt(self, proxy_port):
        return self.target.run('cd %s; wget %s -e https_proxy=%s' %
                               (self.work_dir,
                                self.download_files['squeezenet1.1.prototxt'],
                                proxy_port))
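
A minimal usage sketch, assuming the standard oeqa runtime layout where OERuntimeTestCase provides self.target; the test case name, work directory and proxy URL are illustrative and not part of meta-intel.

# Hypothetical caller (not part of this file): drive the download helper
# from an OpenEmbedded oeqa runtime test case.
from oeqa.runtime.case import OERuntimeTestCase
from oeqa.runtime.miutils.tests.squeezenet_model_download_test import SqueezenetModelDownloadTest

class SqueezenetDownloadCheck(OERuntimeTestCase):
    def test_squeezenet_files_download(self):
        downloader = SqueezenetModelDownloadTest(self.target, '/tmp/mo_test')
        downloader.setup()
        try:
            # target.run() returns (status, output); status 0 means wget succeeded.
            status, output = downloader.test_can_download_squeezenet_model(
                'http://proxy.example.com:911')
            self.assertEqual(status, 0, msg='caffemodel download failed: %s' % output)
            status, output = downloader.test_can_download_squeezenet_prototxt(
                'http://proxy.example.com:911')
            self.assertEqual(status, 0, msg='prototxt download failed: %s' % output)
        finally:
            downloader.tear_down()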