Hi,

I solve a problem thousands of times a day. The structure of the problem stays the same (i.e. no new constraints, only new coefficients in A_eq).

I create a new Xpress problem at each iteration but can't free it from memory afterwards (del p or p = None apparently leaves the instance alive in the back end: running a memory profiler shows memory increasing at each iteration until the process runs out of memory after a few hours; the two attached pictures show memory increasing, or staying flat when the problem creation is taken out of the loop). I use Python 3.6, with Xpress updated to 8.4.7.

My question is: is there a way to delete the instance and free its memory (del problem and problem = None are ineffective)? Alternatively, I could create a single problem instance at the start and only update it at each iteration. How can I implement that? I can change bounds with chgbounds, but I don't see how to update the coefficients of the A_eq * x == b_eq constraints. Note that neither A_ub nor the objective changes.
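For the update-in-place route, one preparatory step is to extract only the coefficients that actually changed between two A_eq matrices, so that only those entries need to be pushed to the solver. A minimal numpy sketch (the helper name changed_coefs and the tolerance argument are illustrative, not part of the Xpress API):

```python
import numpy as np

def changed_coefs(A_old, A_new, tol=0.0):
    """Return (row, col, value) triples where A_new differs from A_old."""
    rows, cols = np.where(np.abs(A_new - A_old) > tol)
    return [(int(r), int(c), float(A_new[r, c])) for r, c in zip(rows, cols)]

A_old = np.array([[1.0, 0.0], [0.0, 2.0]])
A_new = np.array([[1.0, 3.0], [0.0, 2.0]])
print(changed_coefs(A_old, A_new))  # [(0, 1, 3.0)]
```

Each triple could then be applied to the existing problem with the library's coefficient-change routine (the C API has chgcoef; check the Python interface reference for 8.4.7 for the exact method name and signature).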

The problem is of the form:

minimize: c^T * x

subject to: A_ub * x <= b_ub

A_eq * x == b_eq

lb<= x <= ub

Sample code below; the A_eq and A_ub files are attached. Note that the only change needed at each run is a new A_eq matrix and updated bounds (the bounds I can change with chgbounds); A_ub is unchanged.

Many Thanks

Nicolas

import xpress as xp

import numpy as np

import datetime as dt

generalFolder='C:\\Users\\Administrator\\Desktop\\'

nb_dogs=6

A_ub=np.loadtxt(generalFolder+'A_ub.txt')

A_eq=np.loadtxt(generalFolder+'A_eq.txt')

'''build objective function. we want to minimize the last variable'''

obj=np.zeros((12*nb_dogs+10*nb_dogs*(nb_dogs-1)+1,1))

obj[12*nb_dogs+10*nb_dogs*(nb_dogs-1)][0]=1

c=np.ndarray.flatten(obj)

b_ub=np.ndarray.flatten(np.zeros((A_ub.shape[0],1))) #b_ub is a vector of 0 except last two rows

b_ub[nb_dogs*(nb_dogs-1)]=-0.1

b_ub[nb_dogs*(nb_dogs-1)+1]=500

b_eq=np.ndarray.flatten(np.zeros((A_eq.shape[0],1)))

bounds=tuple(((0,100) for i in range(0,nb_dogs)))

bounds=bounds+tuple((0,0) for i in range(nb_dogs,8*nb_dogs+4*nb_dogs*(nb_dogs-1)))

bounds=bounds+tuple(((0,None) for _ in range(0,4*nb_dogs+6*nb_dogs*(nb_dogs-1))))

bounds=bounds+tuple((None,None)for _ in range(0,1))

'''converts list of tuples to vectors for XPress'''

lb=[]

ub=[]

for b in bounds:
    if b[0] is None:
        lb.append(-xp.infinity)
    else:
        lb.append(b[0])
    if b[1] is None:
        ub.append(xp.infinity)
    else:
        ub.append(b[1])
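As an aside, the None-to-infinity conversion above can be written more compactly with two list comprehensions; a standalone sketch using float('inf') in place of xp.infinity so it runs without the solver:

```python
# Bounds in the same (lo, hi) tuple form as above; None means unbounded.
bounds = ((0, 100), (0, 0), (0, None), (None, None))

lb = [-float('inf') if lo is None else lo for lo, hi in bounds]
ub = [float('inf') if hi is None else hi for lo, hi in bounds]

print(lb)  # [0, 0, 0, -inf]
print(ub)  # [100, 0, inf, inf]
```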

'''

writes problem for xPress and solves

'''

for i in range(0, 500):
    p = xp.problem()
    x = np.array([xp.var(lb=-xp.infinity, ub=xp.infinity, vartype=xp.continuous)
                  for j in range(A_ub.shape[1])])
    p.addVariable(x)
    p.addConstraint(xp.Dot(A_ub, x) <= b_ub)
    p.addConstraint(xp.Dot(A_eq, x) == b_eq)
    p.addConstraint(x <= ub)
    p.addConstraint(lb <= x)
    p.setObjective(xp.Dot(c, x), sense=xp.minimize)
    p.solve()
    # del p or p = None doesn't prevent the memory from increasing

Hi Nicolas,

This seems to be a reiteration of the memory leak with the Dot function posted a while ago, which is being investigated. Do you observe the same problem when you replace the Dot operation with explicit sums, as suggested in your previous post, Python - memory leak?
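For reference, the explicit-sum replacement turns each matrix constraint into per-row constraints of the form xp.Sum(A_ub[i][j] * x[j] for j in range(n)) <= b_ub[i] (per the earlier post; check the exact builder names in your version). The two formulations are algebraically identical, as a quick numpy check illustrates:

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(4, 3)
v = np.random.randn(3)

# Row i of the matrix product equals the explicit sum over j of A[i, j] * v[j]
explicit = [sum(A[i, j] * v[j] for j in range(3)) for i in range(4)]
assert np.allclose(A @ v, explicit)
print("Dot and explicit sums agree")
```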

Note that del p or setting p = None has no effect, as the garbage collector would do the same. It seems p is holding onto objects created with Dot() that prevent its deletion, hence the memory build-up.

Let me know if the replacement improves your memory usage. I can reproduce the leak arising from the Dot function.

Pietro