Hi, I am using the KFServing v0.5.1 component to host a model. I am able to download and deploy the model from S3, but I run into problems when trying to access it.

After deployment, KFServing outputs the following endpoint:

http://recommendation-model.kubeflow-user-example-com.example.com
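
(A KFServing v1beta1 TensorFlow InferenceService behind an endpoint like this would look roughly like the following sketch; the bucket path is a placeholder and the S3 credential wiring is omitted, since neither is shown in this post.)

cat <<EOF | kubectl apply -f -
apiVersion: serving.kubeflow.org/v1beta1
kind: InferenceService
metadata:
  name: recommendation-model
  namespace: kubeflow-user-example-com
spec:
  predictor:
    # S3 credentials (secret / service-account annotations) omitted in this sketch
    tensorflow:
      storageUri: "s3://<my-bucket>/recommendation_model"   # placeholder bucket path
EOF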

I could not access it from outside or from inside the node. After looking around, I changed the ingress gateway service from NodePort to LoadBalancer and added an sslip.io entry to knative-serving's config-domain ConfigMap.
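
(The service-type switch can be done with a patch along these lines, assuming the standard istio-ingressgateway service in the istio-system namespace.)

kubectl patch service istio-ingressgateway -n istio-system \
  -p '{"spec": {"type": "LoadBalancer"}}'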

For the domain I followed the knative-dns-config guide:

<IPofMyLB>.sslip.io: ""
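
(Equivalently, the same entry can be applied with a merge patch against the config-domain ConfigMap in knative-serving; replace <IPofMyLB> with the load balancer address.)

kubectl patch configmap config-domain -n knative-serving \
  --type merge -p '{"data": {"<IPofMyLB>.sslip.io": ""}}'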

After that I tried to run inference against the model, but I got neither an error nor a response from the server:

curl -d '{"instances": ["abc"]}'   -X POST http://recommendation-model.kubeflow-user-example-com.<IPofMyLB>.sslip.io/v1/models/recommendation-model:predict

When I simply GET the inference endpoint, the output looks like this:

curl -v -X GET http://recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
Note: Unnecessary use of -X or --request, GET is already inferred.
*   Trying ELBIP...
* TCP_NODELAY set
* Connected to recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io (ELBIP) port 80 (#0)
> GET / HTTP/1.1
> Host: recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 302 Found
< content-type: text/html; charset=utf-8
< location: /dex/auth?client_id=kubeflow-oidc-authservice&redirect_uri=%2Flogin%2Foidc&response_type=code&scope=profile+email+groups+openid&state=MTYyMjY0MjU5M3xFd3dBRUV4MVVERkljREpRVUc1SVdXeDFaVkk9fOEnkjCWGNj6WPOgFhv2BUwNSKHsYyBR2kyj9_0geX2f
< date: Wed, 02 Jun 2021 14:03:13 GMT
< content-length: 269
< x-envoy-upstream-service-time: 1
< server: istio-envoy
<
<a href="/dex/auth?client_id=kubeflow-oidc-authservice&amp;redirect_uri=%2Flogin%2Foidc&amp;response_type=code&amp;scope=profile+email+groups+openid&amp;state=MTYyMjY0MjU5M3xFd3dBRUV4MVVERkljREpRVUc1SVdXeDFaVkk9fOEnkjCWGNj6WPOgFhv2BUwNSKHsYyBR2kyj9_0geX2f">Found</a>.

* Connection #0 to host recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io left intact
* Closing connection 0
(base) ahsan@Ahsans-MacBook-Pro kfserving % curl -v -X GET http://recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
Note: Unnecessary use of -X or --request, GET is already inferred.
*   Trying ELBIP...
* TCP_NODELAY set
* Connected to recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io (ELBIP) port 80 (#0)
> GET / HTTP/1.1
> Host: recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 302 Found
< content-type: text/html; charset=utf-8
< location: /dex/auth?client_id=kubeflow-oidc-authservice&redirect_uri=%2Flogin%2Foidc&response_type=code&scope=profile+email+groups+openid&state=MTYyMjY0MzE3OXxFd3dBRUhOdmFrSk9iVkU0Wms1VmMzVnhXbkU9fECJ3_U0SaWkR441eIWq-AJbFAV29-2Bk8uxPAOxPJD0
< date: Wed, 02 Jun 2021 14:12:59 GMT
< content-length: 269
< x-envoy-upstream-service-time: 1
< server: istio-envoy
<
<a href="/dex/auth?client_id=kubeflow-oidc-authservice&amp;redirect_uri=%2Flogin%2Foidc&amp;response_type=code&amp;scope=profile+email+groups+openid&amp;state=MTYyMjY0MzE3OXxFd3dBRUhOdmFrSk9iVkU0Wms1VmMzVnhXbkU9fECJ3_U0SaWkR441eIWq-AJbFAV29-2Bk8uxPAOxPJD0">Found</a>.

* Connection #0 to host recommendation-model.kubeflow-user-example-com.ELBIP.sslip.io left intact
* Closing connection 0

Model directory structure:

recommendation_model/
└── 1
    ├── assets
    ├── keras_metadata.pb
    ├── saved_model.pb
    └── variables
        ├── variables.data-00000-of-00001
        └── variables.index

I have no idea how to get serving to work, and it is the last piece of my pipeline.


1 Answer


When the inference service is accessed through the LoadBalancer istio-ingressgateway, your request goes through an additional layer of control compared to NodePort, dictated by the Istio security policies.

The response message from curl indicates that you have Istio installed with Dex authentication.

The istio-dex guide provides an example of how to set a cookie to authenticate inference requests.
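
In practice that means passing your Kubeflow session cookie along with the request. A minimal sketch, assuming the default Dex/AuthService setup where the session cookie is named authservice_session (copy its value from the browser after logging in to the Kubeflow dashboard):

SESSION="<authservice_session cookie value>"   # placeholder, taken from a logged-in browser session
curl -v -X POST \
  -H "Cookie: authservice_session=${SESSION}" \
  -d '{"instances": ["abc"]}' \
  http://recommendation-model.kubeflow-user-example-com.<IPofMyLB>.sslip.io/v1/models/recommendation-model:predict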

Answered 2021-06-03T13:15:15.377