I am trying to carry out a simple step in which I have to create several indices that will store the output data of the AWS Comprehend methods.

```python
import json

import requests
from requests.auth import HTTPBasicAuth

for hit in hits:
    filter_hit = json.dumps(self.delete_keys(hit['_source']))

    # Create the entity index and upload the entity data
    entity_data = ComprehendClass.detect_entities(self, filter_hit, language_code)
    requests.put(entity_index_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=entity_data)

    # Create the pii index and upload the PII data
    pii_data = ComprehendClass.detect_pii_entities(self, filter_hit, language_code)
    requests.put(pii_index_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=pii_data)

    # Create the language index and upload the dominant-language data
    dominant_language_data = ComprehendClass.detect_dominant_language(self, filter_hit)
    requests.put(language_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=dominant_language_data)

    # Create the key_phrases index and upload the key-phrase data
    key_phrases_data = ComprehendClass.detect_key_phrases(self, filter_hit, language_code)
    requests.put(keyphrases_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=key_phrases_data)

    # Create the sentiment index and upload the sentiment data
    sentiment_data = ComprehendClass.detect_sentiment(self, filter_hit, language_code)
    requests.put(sentiment_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=sentiment_data)

    # Create the syntax index and upload the syntax data
    syntax_data = ComprehendClass.detect_syntax(self, filter_hit, language_code)
    requests.put(syntax_url, auth=HTTPBasicAuth(USERNAME, PASSWORD), data=syntax_data)
```
I am new to the Python requests package and AWS Elasticsearch, but I would like to know whether there is a way to do this.
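
For context, here is a rough sketch of the direction I am considering. Everything below is illustrative: the endpoint URL, credentials, index names, and the `index_document` / `analyse_and_index` helpers are placeholders, and the Comprehend calls use the plain boto3 client instead of my `ComprehendClass` wrapper. Each result is posted as a JSON document to its index's `_doc` endpoint, so the repeated `requests.put` calls collapse into one small helper.

```python
import boto3
import requests
from requests.auth import HTTPBasicAuth

ES_ENDPOINT = "https://my-es-domain.example.com"  # placeholder: your domain endpoint
USERNAME, PASSWORD = "user", "secret"             # placeholder: basic-auth credentials

comprehend = boto3.client("comprehend")

def index_document(index_name, document):
    """POST one JSON document into <index_name>; the index is created on first use
    if index auto-creation is enabled on the cluster."""
    url = f"{ES_ENDPOINT}/{index_name}/_doc"
    resp = requests.post(
        url,
        auth=HTTPBasicAuth(USERNAME, PASSWORD),
        json=document,   # requests sets Content-Type: application/json for us
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def analyse_and_index(text, language_code="en"):
    """Run the Comprehend detectors on one text and store each result in its own index."""
    results = {
        "entity":      comprehend.detect_entities(Text=text, LanguageCode=language_code),
        "pii":         comprehend.detect_pii_entities(Text=text, LanguageCode=language_code),
        "language":    comprehend.detect_dominant_language(Text=text),
        "key_phrases": comprehend.detect_key_phrases(Text=text, LanguageCode=language_code),
        "sentiment":   comprehend.detect_sentiment(Text=text, LanguageCode=language_code),
        "syntax":      comprehend.detect_syntax(Text=text, LanguageCode=language_code),
    }
    for index_name, result in results.items():
        result.pop("ResponseMetadata", None)  # drop boto3 metadata before indexing
        index_document(index_name, result)
```

Inside my original loop, each hit would then just call `analyse_and_index(filter_hit, language_code)`. Is something along these lines the right approach?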