Found something strange (maybe a bug) when computing the convergence rate in the adaptive solver

Issue #878 duplicate
1161799932@qq.com created an issue

There are two identical pieces of code between the "#############" comment markers.

The first one works fine, but the second one results in a segmentation fault. I don't know why.

Thanks in advance.

The source code is below:

from dolfin import *

def my_adaptive_solver(u_e, f, mesh):
    V = FunctionSpace(mesh, "Lagrange", 1)

    bc = DirichletBC(V, u_e, DomainBoundary())

    u = TrialFunction(V)
    v = TestFunction(V)
    a = inner(grad(u), grad(v))*dx()
    L = f*v*dx

    # Define function for the solution
    u = Function(V)
    # Define goal functional (quantity of interest)
    M = u*dx()
    tol = 1.e-4

    problem = LinearVariationalProblem(a, L, u, bc)
    solver = AdaptiveLinearVariationalSolver(problem, M)
    solver.parameters['max_iterations'] = 20
    solver.solve(tol)
    #solver.summary()
    #data = solver.adaptive_data()

    u_h_list = [u.root_node()]
    num_dofs_list = [u.root_node().function_space().dim()]
    temp = u.root_node()
    while temp.has_child():
        u_h_list.append(temp.child())
        num_dofs_list.append(temp.child().function_space().dim())
        temp = temp.child()
##############################
    error = [errornorm(u_e, u_h, norm_type='H10') for u_h in u_h_list]
    h = [n**(-1.0/2) for n in num_dofs_list]
    rate = []
    from math import log as ln
    for i in range(len(u_h_list) - 1):
        rate.append(ln(error[i]/error[i+1]) / ln(h[i]/h[i+1]))
    print(rate)
##############################

    return u_h_list, num_dofs_list


if __name__ == '__main__':
    import pdb
    pdb.set_trace()
    mesh = UnitSquareMesh(2, 2)
    u_e = Expression('1 + x[0]*x[0] + 2*x[1]*x[1]', degree=2)
    f = Constant(-6.0)

    u_h_list, num_dofs_list = my_adaptive_solver(u_e, f, mesh)

##############################
    error = [errornorm(u_e, u_h, norm_type='H10') for u_h in u_h_list]
    h = [n**(-1.0/2) for n in num_dofs_list]
    rate = []
    from math import log as ln
    for i in range(len(u_h_list) - 1):
        rate.append(ln(error[i]/error[i+1]) / ln(h[i]/h[i+1]))
    print(rate)
##############################
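(For reference, the rate formula itself does not depend on DOLFIN and can be sanity-checked in isolation; the `convergence_rates` helper below is just an illustrative sketch, not part of the code above:)

```python
from math import log as ln

def convergence_rates(errors, hs):
    # rate_i = ln(e_i / e_{i+1}) / ln(h_i / h_{i+1}),
    # one observed rate per consecutive pair of refinement levels.
    return [ln(errors[i] / errors[i + 1]) / ln(hs[i] / hs[i + 1])
            for i in range(len(errors) - 1)]

# For a second-order method, halving h should quarter the error:
errors = [1.0, 0.25, 0.0625]
hs = [1.0, 0.5, 0.25]
print(convergence_rates(errors, hs))   # both rates come out as ~2
```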

Comments (6)

  1. Jan Blechta

    This is clearly caused by flawed memory management in the adaptivity pipeline. It should be resolved by #642 and #643. To demonstrate this (and to work around the problem), apply the following patch to your MWE:

    32c32,37
    <         temp = temp.child()
    ---
    >         temp_ = temp.child()
    >         temp_._parent = temp
    >         temp._child = temp_
    >         temp._solver = solver
    >         temp._problem = problem
    >         temp = temp_
    

    This is just some trial-and-error hacking with the lifetime of the objects. The whole issue stems from the fact that some objects are destroyed prematurely: reference counting is not done properly through shared pointers, and no-delete pointers are used instead.
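    The lifetime problem can be illustrated with a minimal pure-Python sketch (the `Node` class and `refine` function are hypothetical stand-ins, not DOLFIN types): when nothing holds a strong reference to an object, CPython destroys it immediately, exactly as the adaptive hierarchy's intermediate solutions are destroyed here; pinning strong references in both directions, as the patch does with `_parent`/`_child`, keeps the chain alive.

```python
import weakref

class Node(object):
    """Stand-in for a node in an adaptive hierarchy (illustrative only;
    this is NOT a DOLFIN class)."""
    pass

def refine(parent):
    child = Node()
    # Simulate a "no-delete" handle: only a weak reference escapes,
    # so nothing owns the child once this function returns.
    return weakref.ref(child)

parent = Node()
handle = refine(parent)
print(handle() is None)   # True: the child was destroyed prematurely

# Pinning strong references in both directions (the patch's workaround)
# keeps the whole chain alive as long as either end is reachable:
parent = Node()
child = Node()
child._parent = parent
parent._child = child
handle = weakref.ref(child)
print(handle() is not None)   # True: child survives via parent._child
```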
