I built a model and put it behind AI Platform.
I would like to get additional data from Firestore (which I am using for document storage) before serving a prediction.
Is this possible?
If not, how can I work around this problem? One option is to create another microservice that connects to Firebase and returns the object, but I'd rather keep everything inside one container.
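The single-container idea seems workable if the Firestore client is created once per container and reused across requests, rather than rebuilt on every call to `predict`. A minimal sketch of that caching pattern — the `factory` parameter is a stand-in for `firestore.client`, purely so the sketch runs without any Firebase dependency:

```python
# Module-level cache: the (hypothetical) Firestore client is built once
# per container process, not once per prediction request.
_db = None

def get_db(factory=object):
    """Return a cached client, creating it on first use.

    `factory` stands in for `firestore.client`; it is only called the
    first time, after which the cached instance is returned.
    """
    global _db
    if _db is None:
        _db = factory()
    return _db

print(get_db() is get_db())  # True: second call reuses the cached client
```

The same effect can be had by creating the client in `from_path` and storing it on the predictor instance alongside the model.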
For example, using the custom prediction routine I have:
    class MyPredictor(object):
        """An example Predictor for an AI Platform custom prediction routine."""

        def __init__(self, model):
            """Stores artifacts for prediction. Only initialized via `from_path`."""
            self._model = model

        def predict(self, instances, **kwargs):
            """Performs custom prediction.

            Preprocesses inputs, then performs prediction using the trained
            scikit-learn model.

            Args:
                instances: A list of prediction input instances.
                **kwargs: A dictionary of keyword args provided as additional
                    fields on the predict request body.

            Returns:
                A list of outputs containing the prediction results.
            """
            # inputs = np.asarray(instances)
            # outputs = self._model.predict(inputs)
            import firebase_admin
            from firebase_admin import credentials
            from firebase_admin import firestore

            cred = credentials.ApplicationDefault()
            return cred
But this gives me an Internal Error when serving a prediction on AI Platform.
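One plausible cause of the error, assuming the routine otherwise loads: AI Platform JSON-encodes whatever `predict` returns, and a credentials object is not JSON-serializable, so the encoder raises and the service surfaces it as an opaque Internal Error. A stdlib-only check of the two cases (`FakeCredential` is a stand-in for the object returned by `credentials.ApplicationDefault()`):

```python
import json

class FakeCredential:
    """Stand-in for the object returned by credentials.ApplicationDefault()."""

# A plain list of numbers encodes fine -- the shape predict() is expected to return.
print(json.dumps([0.1, 0.9]))  # [0.1, 0.9]

# An arbitrary object does not; the serving layer reports this as a server error.
try:
    json.dumps(FakeCredential())
except TypeError as exc:
    print("not serializable:", exc)
```

So even once the Firestore lookup works, `predict` should return a JSON-serializable list of results, not the credential or client object itself.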