I recently started learning PETSc and ran into a problem while trying to do some simple tasks. What is wrong with this code:
static char help[] = "Test 2d DMDAs Vecs.\n\n";

#include <petscdm.h>
#include <petscdmda.h>
#include <petscsys.h>

/* Tag a value with the rank that wrote it: rank 0 writes i, rank 1 writes i*100, ... */
PetscReal process_value(int rank, int i) {
  return i * PetscPowScalar(10, rank * 2);
}

int main(int argc, char **argv) {
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  PetscInt       M = -5, N = -3; /* negative: default 5x3 grid, overridable via -da_grid_x/-da_grid_y */
  DM             da;
  Vec            local, global;

  ierr = PetscInitialize(&argc, &argv, (char*)0, help);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_BOX, M, N, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da, &global);CHKERRQ(ierr);
  ierr = DMCreateLocalVector(da, &local);CHKERRQ(ierr);
  {
    PetscInt    v, i, j, xm, ym, xs, ys;
    PetscScalar **array;

    /* Get the corners of this process's patch of the global grid. */
    ierr = DMDAGetCorners(da, &xs, &ys, 0, &xm, &ym, 0);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%d:xs=%D\txm=%D\tys=%D\tym=%D\n", rank, xs, xm, ys, ym);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);

    /* Fill only the locally owned entries, addressed by global (i,j) indices. */
    ierr = DMDAVecGetArray(da, global, &array);CHKERRQ(ierr);
    v = 0;
    for (j = ys; j < ys + ym; j++) {
      for (i = xs; i < xs + xm; i++) {
        array[j][i] = process_value(rank, ++v);
      }
    }
    ierr = DMDAVecRestoreArray(da, global, &array);CHKERRQ(ierr);
  }
  ierr = VecView(global, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&local);CHKERRQ(ierr);
  ierr = VecDestroy(&global);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
It fills a small 2d array with values tagged by the rank of the writing process: each entry is v * 10^(2*rank), so rank 0 (a 3x3 patch) should own the values 1 through 9 and rank 1 (a 2x3 patch) the values 100 through 600. After a successful compile and link, the run gives:
> mpiexec -n 2 ./problem
0:xs=0 xm=3 ys=0 ym=3
1:xs=3 xm=2 ys=0 ym=3
Vec Object: 2 MPI processes
type: mpi
Vec Object:Vec_0x84000004_0 2 MPI processes
type: mpi
Process [0]
1.
2.
3.
100.
200.
4.
5.
6.
300.
Process [1]
400.
7.
8.
9.
500.
600.
>
The VecView output suggests that each process has written into locations owned by the other process. Where is the mistake? Is DMDAVecGetArray/DMDAVecRestoreArray handing back the wrong array, or is VecView unsuitable for viewing a Vec obtained from a DM object?
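
In case it is relevant, here is a minimal sketch of a check I am considering (assuming DMDACreateNaturalVector and DMDAGlobalToNaturalBegin/End are the right calls for this): copy the global Vec into natural, row-major grid ordering before viewing it, to tell a wrong write apart from a mere ordering effect.

/* Sketch only: view the same data in natural (row-major) ordering.
   Assumes DMDAGlobalToNaturalBegin/End is the right API for the remap. */
Vec natural;
ierr = DMDACreateNaturalVector(da, &natural);CHKERRQ(ierr);
ierr = DMDAGlobalToNaturalBegin(da, global, INSERT_VALUES, natural);CHKERRQ(ierr);
ierr = DMDAGlobalToNaturalEnd(da, global, INSERT_VALUES, natural);CHKERRQ(ierr);
ierr = VecView(natural, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
ierr = VecDestroy(&natural);CHKERRQ(ierr);

If the values line up with the grid in that view, I would conclude the writes themselves are fine and only the viewing order differs.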