I have matrices spread across a number of processes, but they are not necessarily the same size, so I don't think I can use comm.gather() or an equivalent. I therefore wrote my own gather function, but I am running into a very strange error on this line:
self.test_array_global[list_count:list_count+ count_symbol_list[count+1]] = temp_ar
For some reason I can't work out, this line does not seem to index into self.test_array_global correctly, because I get the error below. Even when I try to set the shape of self.test_array_global by hand beforehand, it still does not work:
Traceback (most recent call last):
File "AudModMPI.py", line 183, in <module>
aud.get_test_array()
File "AudModMPI.py", line 149, in get_test_array
self.test_array_global[list_count:list_count+ count_symbol_list[count+1]] = temp_ar
ValueError: could not broadcast input array from shape (82,300,50) into shape (0,300,50)
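If it helps, the same ValueError can be reproduced with plain NumPy, no MPI involved, whenever the target slice ends up empty. This is a minimal sketch using the shapes from the traceback:

import numpy as np

# Hypothetical shapes taken from the traceback above.
big = np.zeros((82, 300, 50))
temp_ar = np.empty((82, 300, 50))

# If the slice is empty (start >= stop) its shape is (0, 300, 50),
# and the assignment raises exactly the ValueError shown above.
big[82:82 + 82] = temp_ar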
Here is the entire function:
def get_test_array(self, to_rank=0):
    if rank != to_rank:
        comm.send(self.test_symbol_list, dest=to_rank, tag=20)
    if (rank == to_rank):
        itera = range(num_proc)
        itera.remove(to_rank)
        self.test_symbol_list_global = []
        self.test_symbol_list_global.extend(self.test_symbol_list)
        count_symbol_list = [len(self.test_symbol_list)]
        temp_list = None
        for i in itera:
            temp_list = comm.recv(source=i, tag=20)
            count_symbol_list.append(len(temp_list))
            self.test_symbol_list_global.extend(temp_list)
    if rank != to_rank:
        comm.Send(self.test_array[0:len(self.test_symbol_list)], dest=to_rank, tag=21)
    if rank == to_rank and num_proc > 1:
        self.test_array_global = zeros((count_symbol_list[-1], self.test_array.shape[1], self.test_array.shape[2]))
        list_count = 0
        for count, i in enumerate(itera):
            list_count += count_symbol_list[count]
            temp_ar = empty((count_symbol_list[count+1], self.test_array.shape[1], self.test_array.shape[2]))
            comm.Recv(temp_ar, source=i, tag=21)
            print rank, list_count
            print rank, count_symbol_list[count+1]
            self.test_array_global[list_count:list_count+ count_symbol_list[count+1]] = temp_ar
        self.test_array_global[0:len(self.test_symbol_list)] = self.test_array[0:len(self.test_symbol_list)]
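For context, the intent of list_count is to be the running offset into the gathered array, i.e. the cumulative sum of the per-rank counts, roughly along the lines of this simplified sketch (names like counts and offsets are just for illustration, not from my actual code):

import numpy as np

# Hypothetical per-rank counts, e.g. as collected via the tag-20 sends above.
counts = [5, 82, 13]

# Running offsets: each rank's block starts where the previous one ended.
offsets = np.cumsum([0] + counts[:-1])        # -> [0, 5, 87]

# The gathered array is meant to hold every rank's rows, so its first
# dimension is the total count across ranks.
gathered = np.zeros((sum(counts), 300, 50))

for off, n in zip(offsets, counts):
    block = np.empty((n, 300, 50))            # stands in for the Recv'd buffer
    gathered[off:off + n] = block             # slice length matches the block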