KiGB is a unified framework for learning gradient-boosted decision trees for regression and classification tasks while leveraging human advice to achieve better performance.

This package contains two implementations of the Knowledge-intensive Gradient Boosting (KiGB) framework:

  • with the Gradient Boosted Decision Tree of Scikit-learn (SKiGB)
  • with the Gradient Boosted Decision Tree of LightGBM (LKiGB)

Both implementations are written in Python.

Basic Usage

'''Step 1: Import the class'''
import numpy as np
import pandas as pd

from core.lgbm.lkigb import LKiGB as KiGB

'''Step 2: Import dataset'''
train_data = pd.read_csv('train.csv')
X_train = train_data.drop('target', axis=1)
Y_train = train_data['target']
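If you do not have a `train.csv` at hand, a tiny synthetic dataset can stand in for it. The file name and the `target` column follow the example above; the five feature columns and their values are made up for illustration:

```python
import numpy as np
import pandas as pd

# Build a small synthetic dataset with five features and a target that
# increases with f0 and decreases with f4 (values are made up).
rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 5))
y = 2 * X[:, 0] - 3 * X[:, 4] + 0.1 * rng.normal(size=100)

train_data = pd.DataFrame(X, columns=[f'f{i}' for i in range(5)])
train_data['target'] = y
train_data.to_csv('train.csv', index=False)

# Read it back exactly as in Step 2.
train_data = pd.read_csv('train.csv')
X_train = train_data.drop('target', axis=1)
Y_train = train_data['target']
```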

'''Step 3: Provide monotonic influence information'''
advice = np.array([1, 0, 1, 1, -1], dtype=int)
# 0 for features with no influence, +1 for features with isotonic influence, -1 for features with antitonic influence
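To make the advice labels concrete: isotonic (+1) means the target should not decrease as the feature grows, and antitonic (-1) means the opposite. The following standalone NumPy sketch (illustrative only, not part of the KiGB package) checks the monotone direction of a feature-target relationship:

```python
import numpy as np

def monotone_direction(xs, ys):
    """Return +1 if ys is non-decreasing in xs (isotonic),
    -1 if non-increasing (antitonic), 0 otherwise."""
    order = np.argsort(xs)          # sort targets by feature value
    diffs = np.diff(ys[order])
    if np.all(diffs >= 0):
        return 1
    if np.all(diffs <= 0):
        return -1
    return 0

xs = np.array([0.0, 1.0, 2.0, 3.0])
print(monotone_direction(xs, 2 * xs))   # isotonic feature -> 1
print(monotone_direction(xs, -xs))      # antitonic feature -> -1
```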

'''Step 4: Train the model'''
kigb = KiGB(lamda=1, epsilon=0.1, advice=advice, objective='regression', trees=30)
kigb.fit(X_train, Y_train)

'''Step 5: Test the model'''
X_test = pd.read_csv('test.csv')
y_pred = kigb.predict(X_test)
To use the Scikit-learn version of KiGB, import SKiGB instead:
from core.scikit.skigb import SKiGB


If you build on this code or the ideas of this paper, please use the following citation.

@inproceedings{kokel2020unified,
  author    = {Harsha Kokel and Phillip Odom and Shuo Yang and Sriraam Natarajan},
  title     = {A Unified Framework for Knowledge Intensive Gradient Boosting: Leveraging Human Experts for Noisy Sparse Domains},
  booktitle = {AAAI},
  year      = {2020}
}


  • Harsha Kokel and Sriraam Natarajan acknowledge the support of Turvo Inc. and CwC Program Contract W911NF-15-1-0461 with the US Defense Advanced Research Projects Agency (DARPA) and the Army Research Office (ARO).