Vendor Recommendation AI with Python on SAP BTP + CAPM Integration
- Raven Infotech
- Jun 24
- 2 min read

SAP BTP isn’t just about enterprise services—it’s a powerful playground for innovation. In this tutorial-style blog, we’ll walk through how to:
- Build a simple ML model in Python
- Deploy it as an AI microservice API on SAP BTP Cloud Foundry
- Call this API from a Node.js CAPM backend
Let’s say you want to recommend the best supplier for a product based on past average ratings. We'll use basic data, build an ML model in Python, and integrate it into a real-world enterprise app.
Use Case: Vendor Recommendation Engine
| Supplier_ID | Product | Avg_Rating |
| ----------- | ------- | ---------- |
| SUP001 | Laptop | 4.5 |
| SUP002 | Laptop | 3.8 |
| SUP003 | Monitor | 4.2 |
Based on this, we want to predict which supplier is most likely to be the best for a given product + rating.
Part 1: Build & Deploy AI Model on SAP BTP Cloud Foundry
1. Create a Python App Locally
File: train_model.py
# Save this script to train and store the model
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import LabelEncoder
import pickle
data = pd.DataFrame({
    'Supplier_ID': ['SUP001', 'SUP002', 'SUP003', 'SUP001', 'SUP002'],
    'Product': ['Laptop', 'Laptop', 'Monitor', 'Monitor', 'Keyboard'],
    'Avg_Rating': [4.5, 3.8, 4.2, 4.0, 4.6]
})
# Encode the categorical columns as integers for the classifier
le_supplier = LabelEncoder()
le_product = LabelEncoder()
data['Supplier_ID_enc'] = le_supplier.fit_transform(data['Supplier_ID'])
data['Product_enc'] = le_product.fit_transform(data['Product'])
# Features: encoded product + rating; target: encoded supplier
X = data[['Product_enc', 'Avg_Rating']]
y = data['Supplier_ID_enc']
model = RandomForestClassifier()
model.fit(X, y)
# Save model and encoders
pickle.dump(model, open("vendor_model.pkl", "wb"))
pickle.dump(le_supplier, open("supplier_encoder.pkl", "wb"))
pickle.dump(le_product, open("product_encoder.pkl", "wb"))
Run this once to create model artifacts.
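Before wiring the artifacts into an API, you can sanity-check them with a few lines like the following (the product "Laptop" and rating 4.2 are just example inputs):
# check_model.py - load the saved artifacts and make one test prediction
import pickle

model = pickle.load(open("vendor_model.pkl", "rb"))
supplier_encoder = pickle.load(open("supplier_encoder.pkl", "rb"))
product_encoder = pickle.load(open("product_encoder.pkl", "rb"))

product_encoded = product_encoder.transform(["Laptop"])[0]
prediction = model.predict([[product_encoded, 4.2]])
print(supplier_encoder.inverse_transform(prediction)[0])  # e.g. SUP001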
2. Create the Flask API (app.py)
from flask import Flask, request, jsonify
import os
import pickle

app = Flask(__name__)

# Load the model and encoders created by train_model.py
model = pickle.load(open("vendor_model.pkl", "rb"))
supplier_encoder = pickle.load(open("supplier_encoder.pkl", "rb"))
product_encoder = pickle.load(open("product_encoder.pkl", "rb"))

@app.route('/recommend', methods=['GET'])
def recommend_supplier():
    product = request.args.get('product')
    rating = float(request.args.get('rating'))
    try:
        product_encoded = product_encoder.transform([product])[0]
        prediction = model.predict([[product_encoded, rating]])
        supplier = supplier_encoder.inverse_transform(prediction)[0]
        return jsonify({"recommended_supplier": supplier})
    except ValueError:
        return jsonify({"error": "Product not found"}), 400

if __name__ == '__main__':
    # Cloud Foundry injects the port to bind to via the PORT env variable
    app.run(host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))
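Before pushing to BTP, you can start the API locally with python app.py and call it from another terminal (the port assumes the 8080 fallback used above):
curl "http://localhost:8080/recommend?product=Laptop&rating=4.2"
This should return a JSON body of the form {"recommended_supplier": "..."}.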
3. Create requirements.txt
flask
scikit-learn
pandas
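Optionally, you can pin the Python version the Cloud Foundry buildpack uses by adding a runtime.txt next to requirements.txt (the version shown is only illustrative and must be one shipped with your buildpack release):
python-3.11.9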
4. Create manifest.yml for Cloud Foundry
applications:
- name: vendor-ai-api
  memory: 256M
  buildpack: python_buildpack
  command: python app.py
  path: .
5. Deploy to SAP BTP CF
cf login
cf target -o your-org -s your-space
cf push
You’ll now get a public route like https://vendor-ai-api.cfapps.<region>.hana.ondemand.com (the exact domain depends on your BTP region).
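A quick smoke test against that route (substitute your actual region, e.g. us10 or eu10):
curl "https://vendor-ai-api.cfapps.<region>.hana.ondemand.com/recommend?product=Laptop&rating=4.2"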
Part 2: Integrate in CAPM (Node.js)
1. In your CAPM app, install Axios:
npm install axios
2. Create a custom handler (srv/vendor-service.js)
const cds = require('@sap/cds');
const axios = require('axios');

module.exports = cds.service.impl(async function () {
  this.on('recommendVendor', async (req) => {
    const { product, rating } = req.data;
    try {
      const response = await axios.get(
        `https://vendor-ai-api.cfapps.<region>.hana.ondemand.com/recommend`,
        { params: { product, rating } }
      );
      return {
        product,
        rating,
        recommendedSupplier: response.data.recommended_supplier
      };
    } catch (e) {
      return req.error(500, `AI API error: ${e.message}`);
    }
  });
});
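Hardcoding the region-specific URL works for a demo, but it is brittle across landscapes. Here is a minimal variation of the same handler that reads the URL from an environment variable instead (VENDOR_API_URL is an assumed name, set e.g. via cf set-env or the app manifest; a BTP destination would be the more robust option):
const cds = require('@sap/cds');
const axios = require('axios');

// Assumed environment variable; falls back to a local Flask instance for development
const VENDOR_API_URL = process.env.VENDOR_API_URL || 'http://localhost:8080';

module.exports = cds.service.impl(async function () {
  this.on('recommendVendor', async (req) => {
    const { product, rating } = req.data;
    try {
      const response = await axios.get(`${VENDOR_API_URL}/recommend`, {
        params: { product, rating }
      });
      return { product, rating, recommendedSupplier: response.data.recommended_supplier };
    } catch (e) {
      return req.error(500, `AI API error: ${e.message}`);
    }
  });
});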
3. Define the action in your service (srv/service.cds)
service VendorService {
  action recommendVendor(product : String, rating : Decimal) returns {
    product             : String;
    rating              : Decimal;
    recommendedSupplier : String;
  };
}
4. Run locally
cds watch
Call via POST:
POST /vendor-service/recommendVendor
{
  "product": "Laptop",
  "rating": 4.2
}
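If everything is wired up, the action responds with the three fields declared in service.cds, roughly like this (the supplier value is only illustrative and depends on what the model predicts; CAP's OData adapter may add metadata fields such as @odata.context):
{
  "product": "Laptop",
  "rating": 4.2,
  "recommendedSupplier": "SUP001"
}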
Final Thoughts
With just a few lines of Python and a Flask API deployed to SAP BTP, you’ve:
- Built a working custom AI microservice
- Connected it to a CAPM backend app
- Created a real, data-powered recommendation system
This is exactly how enterprise apps evolve into AI-driven platforms, one service at a time.