Python crashes while running LightGBM with a custom loss function


I have been trying to implement a LightGBM model with a custom loss function (SMAPE); however, when I run it, Python crashes with the following error:

Unhandled exception at 0x00007FF841D04E65 (lib_lightgbm.dll) in python.exe: 0xC0000005: Access violation reading location 0x000002624155B500

The LightGBM model seems to train fine when there is no custom loss function and I just use the built-in rmse (see the comparison call after the code below).

What does this error mean, and what can I do to rectify it? Thanks in advance for your help.

import numpy as np

def smape(preds, target):
  '''
  Function to calculate SMAPE
  '''
  n = len(preds)
  # drop entries where the target is zero to avoid division by zero
  masked_arr = ~(target==0)
  preds, target = preds[masked_arr], target[masked_arr]
  num = np.abs(preds-target)
  denom = np.abs(target)
  smape_val = (100*np.sum(num/denom))/n
  return smape_val
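
The metric function itself looks fine on toy inputs (made-up values, just a quick check):

print(smape(np.array([110.0, 90.0]), np.array([100.0, 100.0])))   # prints 10.0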

def lgbm_smape(preds, train_data):
  '''
  Custom Evaluation Function for LGBM
  '''
  labels = train_data.get_label()
  # back-transform predictions and labels with expm1 before scoring
  smape_val = smape(np.expm1(preds), np.expm1(labels))
  return 'SMAPE', smape_val, False

def grads(x, y):
  '''
  First derivative of abs(x - y) / (x + y) with respect to x
  '''
  return 2 * y * (x - y) / ((x + y) * (x + y) * abs(x - y))


def hesss(x, y):
  '''
  Second derivative of abs(x - y) / (x + y) with respect to x
  '''
  return (-4 * y * (x - y)) / ((x + y) * (x + y) * (x + y) * abs(x - y))
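
As a quick sanity check (toy values and a throwaway helper, not part of the pipeline), both functions match central finite differences of the per-sample term abs(x - y) / (x + y):

def smape_term(x, y):
  # per-sample term whose derivatives grads/hesss compute
  return abs(x - y) / (x + y)

x, y, eps = 3.0, 5.0, 1e-4
num_grad = (smape_term(x + eps, y) - smape_term(x - eps, y)) / (2 * eps)
num_hess = (smape_term(x + eps, y) - 2 * smape_term(x, y) + smape_term(x - eps, y)) / eps**2
print(num_grad, grads(x, y))   # both ~ -0.15625
print(num_hess, hesss(x, y))   # both ~  0.0390625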

def lgbm_obj(preds, train_data):
  '''
  Custom obj Function for LGBM
  '''
  labels = train_data.get_label()
  # drop rows where both the prediction and the label are zero
  masked_arr = ~((preds==0)&(labels==0))
  preds, labels = preds[masked_arr], labels[masked_arr]
  grad = grads(labels,preds)
  hess = hesss(labels,preds)
  return grad, hess

import lightgbm as lgb

params = {'task': 'train',
          'boosting_type': 'dart',
          'objective': 'regression',
          'metric': {'rmse'}, 'num_leaves': 100, 'learning_rate': 0.003,
          'feature_fraction': 0.8, 'max_depth': 6, 'verbose': 0,
          'num_boost_round': 35000, 'nthread': -1
          }

lgbtrain = lgb.Dataset(data=X_train.values,
                       label=y_train['demandQuantity'].values)

lgbval = lgb.Dataset(data=X_test.values,
                     label=y_test['demandQuantity'].values,
                     reference=lgbtrain)

model = lgb.train(params, lgbtrain,
                  num_boost_round=params['num_boost_round'],
                  valid_sets=[lgbtrain, lgbval], feval=lgbm_smape,
                  fobj=lgbm_obj, verbose_eval=200
                  )
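
For comparison, the same training call with just the built-in rmse (i.e. dropping fobj and feval) runs without crashing, roughly like this:

model = lgb.train(params, lgbtrain,
                  num_boost_round=params['num_boost_round'],
                  valid_sets=[lgbtrain, lgbval],
                  verbose_eval=200
                  )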
Tags: python, machine-learning, loss-function, lightgbm
asked on Stack Overflow Aug 30, 2018 by Dwarkesh23 • edited Aug 30, 2018 by SherylHohman

0 Answers

Nobody has answered this question yet.


User contributions licensed under CC BY-SA 3.0