Caffe Python MemoryDataLayer Segmentation Fault

Please credit the source when reposting: 楼燚(yì)航's blog, http://home.cnblogs.com/louyihang-loves-baiyan/

Since Python makes data preprocessing convenient, I chose MemoryDataLayer as the data layer: data can be fed straight from Python with set_input_arrays, and then a call to the solver's step runs the training iteration.
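For context, here is a minimal sketch of this feeding workflow. The solver path, batch size, and array shapes are placeholders; what matters (see the exporter code further down) is that the arrays are C-contiguous float32, the data is 4-d (N, C, H, W), the labels are shaped (N, 1, 1, 1), and N is a multiple of the MemoryData layer's batch_size.

import numpy as np
import caffe

caffe.set_mode_gpu()                         # or caffe.set_mode_cpu()
solver = caffe.SGDSolver('solver.prototxt')  # net whose first layer is MemoryData

batch_size = 64                              # must match the MemoryData layer's batch_size

# Preprocess in Python, then hand the net contiguous float32 arrays.
data = np.random.rand(batch_size, 3, 224, 224).astype(np.float32)
labels = np.random.randint(0, 10, (batch_size, 1, 1, 1)).astype(np.float32)

solver.net.set_input_arrays(data, labels)
solver.step(1)                               # one forward/backward/update on the fed data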

Here is the problem I ran into. On inspection nothing seemed wrong, yet the run kept aborting with

Segmentation Fault (core dumped)

Ugh, this is the error I dread most. A segmentation fault is usually a memory error: an out-of-bounds array access, an invalid pointer, use of already-freed memory, and so on. After debugging step by step, the problem turned out to be in

solver.net.set_input_arrays

The error is raised when the solver passes the data down to the bottom of the network. So the next step is to open caffe/python/caffe/_caffe.cpp. This file uses Boost.Python to export the C++ interface so it can be called from Python. In it we find the relevant function:

void Net_SetInputArrays(Net<Dtype>* net, bp::object data_obj,
    bp::object labels_obj) {
  // check that this network has an input MemoryDataLayer
  shared_ptr<MemoryDataLayer<Dtype> > md_layer =
    boost::dynamic_pointer_cast<MemoryDataLayer<Dtype> >(net->layers()[0]);
  if (!md_layer) {
    throw std::runtime_error("set_input_arrays may only be called if the"
        " first layer is a MemoryDataLayer");
  }
  // check that we were passed appropriately-sized contiguous memory
  PyArrayObject* data_arr =
      reinterpret_cast<PyArrayObject*>(data_obj.ptr());
  PyArrayObject* labels_arr =
      reinterpret_cast<PyArrayObject*>(labels_obj.ptr());
  CheckContiguousArray(data_arr, "data array", md_layer->channels(),
      md_layer->height(), md_layer->width());
  CheckContiguousArray(labels_arr, "labels array", 1, 1, 1);
  if (PyArray_DIMS(data_arr)[0] != PyArray_DIMS(labels_arr)[0]) {
    throw std::runtime_error("data and labels must have the same first"
        " dimension");
  }
  if (PyArray_DIMS(data_arr)[0] % md_layer->batch_size() != 0) {
    throw std::runtime_error("first dimensions of input arrays must be a"
        " multiple of batch size");
  }
  md_layer->Reset(static_cast<Dtype*>(PyArray_DATA(data_arr)),
      static_cast<Dtype*>(PyArray_DATA(labels_arr)),
      PyArray_DIMS(data_arr)[0]);
}

The problem lies in the last statement:

 md_layer->Reset(static_cast<Dtype*>(PyArray_DATA(data_arr)),
      static_cast<Dtype*>(PyArray_DATA(labels_arr)),
      PyArray_DIMS(data_arr)[0]);

The crash happens when the MemoryDataLayer's Reset function runs. The same problem also shows up on GitHub: https://github.com/BVLC/caffe/issues/2334 is another segmentation fault triggered by the Python MemoryDataLayer. The suggestion there is to make a deep copy of the incoming data and labels inside the layer, which fixes it; presumably the underlying arrays have already been freed at run time, and because only raw pointers were stored, the dangling pointers cause the segmentation fault.
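To make the suspected lifetime problem concrete, here is a sketch of one pattern that can trigger it (the variable names and shapes are placeholders): the float32 copies produced by astype() are temporaries, so once set_input_arrays returns nothing on the Python side keeps them alive, while the C++ layer still holds raw pointers into their memory.

import numpy as np
import caffe

solver = caffe.SGDSolver('solver.prototxt')  # placeholder path
images = np.random.rand(64, 3, 224, 224)     # float64 stand-ins for preprocessed data
targets = np.random.randint(0, 10, (64, 1, 1, 1))

# Risky: astype() creates temporary float32 copies. After this call returns,
# Python may garbage-collect them, leaving the MemoryDataLayer with dangling
# data_/labels_ pointers.
solver.net.set_input_arrays(images.astype(np.float32),
                            targets.astype(np.float32))
solver.step(1)                               # may read freed memory and segfault

Keeping named references to the float32 arrays alive until after solver.step() is one way to sidestep this from the Python side; the deep copy inside Reset() described below removes the dependency on the caller's arrays altogether.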

Solution:
Open caffe/src/caffe/layers/memory_data_layer.cpp and add the corresponding deep-copy code to the Reset function:

template <typename Dtype>
void MemoryDataLayer<Dtype>::Reset(Dtype* data, Dtype* labels, int n) {
  CHECK(data);
  CHECK(labels);
  CHECK_EQ(n % batch_size_, 0) << "n must be a multiple of batch size";
  // Warn with transformation parameters since a memory array is meant to
  // be generic and no transformations are done with Reset().
  if (this->layer_param_.has_transform_param()) {
    LOG(WARNING) << this->type() << " does not transform array data on Reset()";
  }
  // Comment out the original shallow assignment, which only stored the
  // caller's raw pointers:
  // data_ = data;
  // labels_ = labels;

  // Deep-copy the incoming arrays instead, so the layer owns the memory and
  // no longer depends on the caller keeping the numpy buffers alive.
  // Free the copy made by a previous Reset() before allocating a new one.
  if (data_)
    delete[] data_;
  if (labels_)
    delete[] labels_;
  // size_ is channels * height * width per sample; num_tasks_ is the number
  // of labels per sample in this (modified) layer. In stock Caffe there is a
  // single label per sample, so the label buffer would simply have size n.
  data_ = new Dtype[n * size_];
  labels_ = new Dtype[n * num_tasks_];

  memcpy(data_, data, sizeof(Dtype) * n * size_);
  memcpy(labels_, labels, sizeof(Dtype) * n * num_tasks_);

  n_ = n;
  pos_ = 0;
}

OK, with the change in place, go back to the Caffe root directory and run `make all`, `make test`, `make runtest`, and `make pycaffe`. Once the rebuild finishes, rerun the script and it works; training continues as before.

posted @ 2016-03-26 16:16  Hello~again