
[Bug]: search failed with error channel not available[channel=channel distribution is not serviceable] after upgrade from 2.4-20250116-622af571-amd64 to 2.5-20250116-c945efa8-amd64 #39339

Closed
1 task done
zhuwenxing opened this issue Jan 16, 2025 · 4 comments
Labels
kind/bug Issues or changes related a bug priority/critical-urgent Highest priority. Must be actively worked on as someone's top priority right now. severity/critical Critical, lead to crash, data missing, wrong result, function totally doesn't work. triage/accepted Indicates an issue or PR is ready to be actively worked on.

@zhuwenxing
Contributor

Is there an existing issue for this?

  • I have searched the existing issues

Environment

- Milvus version: 2.4-20250116-622af571-amd64 --> 2.5-20250116-c945efa8-amd64
- Deployment mode(standalone or cluster):standalone
- MQ type(rocksmq, pulsar or kafka): pulsar   
- SDK version(e.g. pymilvus v2.0.0rc2):
- OS(Ubuntu or CentOS): 
- CPU/Memory: 
- GPU: 
- Others:

Current Behavior


[2025-01-16T07:49:13.187Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_BIN_IVF_FLAT_is_compacted_is_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=106, message=failed to search: loaded collection do not found any channel in target, may be in recovery: collection on recovering[collection=455346884315045188])>

[2025-01-16T07:49:13.187Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_HNSW_is_compacted_not_compacted_segment_status_only_growing_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>

[2025-01-16T07:49:13.187Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_SQ8_is_compacted_not_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>

[2025-01-16T07:49:13.188Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_BIN_IVF_FLAT_is_compacted_not_compacted_segment_status_only_growing_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>

[2025-01-16T07:49:13.188Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_PQ_is_compacted_is_compacted_segment_status_only_growing_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>

[2025-01-16T07:49:13.188Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_HNSW_is_compacted_not_compacted_segment_status_only_growing_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>
self = <test_action_second_deployment.TestActionSecondDeployment object at 0x7fb5b6db3f10>
all_collection_name = 'deploy_test_index_type_HNSW_is_compacted_not_compacted_segment_status_only_growing_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000'
data_size = 3000

    @pytest.mark.tags(CaseLabel.L3)
    def test_check(self, all_collection_name, data_size):
        """
        before reinstall: create collection
        """
        self._connect()
        ms = MilvusSys()
        name = all_collection_name
        is_binary = False
        if "BIN" in name:
            is_binary = True
        collection_w, _ = self.collection_wrap.init_collection(name=name)
        self.collection_w = collection_w
        schema = collection_w.schema
        data_type = [field.dtype for field in schema.fields]
        field_name = [field.name for field in schema.fields]
        type_field_map = dict(zip(data_type, field_name))
        if is_binary:
            default_index_field = ct.default_binary_vec_field_name
            vector_index_type = "BIN_IVF_FLAT"
        else:
            default_index_field = ct.default_float_vec_field_name
            vector_index_type = "IVF_FLAT"

        binary_vector_index_types = [index.params["index_type"] for index in collection_w.indexes if
                                     index.field_name == type_field_map.get(100, "")]
        float_vector_index_types = [index.params["index_type"] for index in collection_w.indexes if
                                    index.field_name == type_field_map.get(101, "")]
        index_field_map = dict([(index.field_name, index.index_name) for index in collection_w.indexes])
        index_names = [index.index_name for index in collection_w.indexes]  # used to drop index
        vector_index_types = binary_vector_index_types + float_vector_index_types
        if len(vector_index_types) > 0:
            vector_index_type = vector_index_types[0]
        try:
            t0 = time.time()
            self.utility_wrap.wait_for_loading_complete(name)
            log.info(f"wait for {name} loading complete cost {time.time() - t0}")
        except Exception as e:
            log.error(e)
        # get replicas loaded
        try:
            replicas = collection_w.get_replicas(enable_traceback=False)
            replicas_loaded = len(replicas.groups)
        except Exception as e:
            log.error(e)
            replicas_loaded = 0

        log.info(f"collection {name} has {replicas_loaded} replicas")
        actual_replicas = re.search(r'replica_number_(.*?)_', name).group(1)
        assert replicas_loaded == int(actual_replicas)
        # params for search and query
        if is_binary:
            _, vectors_to_search = cf.gen_binary_vectors(
                default_nb, default_dim)
            default_search_field = ct.default_binary_vec_field_name
        else:
            vectors_to_search = cf.gen_vectors(default_nb, default_dim)
            default_search_field = ct.default_float_vec_field_name
        search_params = gen_search_param(vector_index_type)[0]

        # load if not loaded
        if replicas_loaded == 0:
            # create index for vector if not exist before load
            is_vector_indexed = False
            index_infos = [index.to_dict() for index in collection_w.indexes]
            for index_info in index_infos:
                if "metric_type" in index_info.keys() or "metric_type" in index_info["index_param"]:
                    is_vector_indexed = True
                    break
            if is_vector_indexed is False:
                default_index_param = gen_index_param(vector_index_type)
                self.create_index(collection_w, default_index_field, default_index_param)
            collection_w.load()

        # search and query
        if "empty" in name:
            # if the collection is empty, the search result should be empty, so no need to check
            check_task = None
        else:
            check_task = CheckTasks.check_search_results

        collection_w.search(vectors_to_search[:default_nq], default_search_field,
                            search_params, default_limit,
                            default_search_exp,
                            output_fields=[ct.default_int64_field_name],
                            check_task=check_task,
                            check_items={"nq": default_nq,
                                         "limit": default_limit})
        if "empty" in name:
            check_task = None
        else:
            check_task = CheckTasks.check_query_not_empty
        collection_w.query(default_term_expr, output_fields=[ct.default_int64_field_name],
                           check_task=check_task)

        # flush
        if pymilvus_version >= "2.2.0":
            collection_w.flush()
        else:
            collection_w.collection.num_entities

        # search and query
        if "empty" in name:
            check_task = None
        else:
            check_task = CheckTasks.check_search_results
        collection_w.search(vectors_to_search[:default_nq], default_search_field,
                            search_params, default_limit,
                            default_search_exp,
                            output_fields=[ct.default_int64_field_name],
                            check_task=check_task,
                            check_items={"nq": default_nq,
                                         "limit": default_limit})
        if "empty" in name:
            check_task = None
        else:
            check_task = CheckTasks.check_query_not_empty
        collection_w.query(default_term_expr, output_fields=[ct.default_int64_field_name],
                           check_task=check_task)

        # insert data and flush
        for i in range(2):
            self.insert_data_general(insert_data=True, is_binary=is_binary, nb=data_size,
                                     is_flush=False, is_index=True, name=name,
                                     enable_dynamic_field=False, with_json=False)
        if pymilvus_version >= "2.2.0":
            collection_w.flush()
        else:
            collection_w.collection.num_entities

        # delete data
        delete_expr = f"{ct.default_int64_field_name} in [0,1,2,3,4,5,6,7,8,9]"
        collection_w.delete(expr=delete_expr)

        # search and query
>       collection_w.search(vectors_to_search[:default_nq], default_search_field,
                            search_params, default_limit,
                            default_search_exp,
                            output_fields=[ct.default_int64_field_name],
                            check_task=CheckTasks.check_search_results,
                            check_items={"nq": default_nq,
                                         "limit": default_limit})

testcases/test_action_second_deployment.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.10/dist-packages/pymilvus/orm/collection.py:799: in search
    resp = conn.search(
/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py:141: in handler
    raise e from e
/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py:137: in handler
    return func(*args, **kwargs)
/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py:176: in handler
    return func(self, *args, **kwargs)
/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py:116: in handler
    raise e from e
/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py:86: in handler
    return func(*args, **kwargs)
/usr/local/lib/python3.10/dist-packages/pymilvus/client/grpc_handler.py:836: in search
    return self._execute_search(request, timeout, round_decimal=round_decimal, **kwargs)
/usr/local/lib/python3.10/dist-packages/pymilvus/client/grpc_handler.py:777: in _execute_search
    raise e from e
/usr/local/lib/python3.10/dist-packages/pymilvus/client/grpc_handler.py:766: in _execute_search
    check_status(response.status)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

status = error_code: NoReplicaAvailable
reason: "fail to search on QueryNode 4: channel not available[channel=channel distribut...884312439987v0: fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable]"

    def check_status(status: Status):
        if status.code != 0 or status.error_code != 0:
>           raise MilvusException(status.code, status.reason, status.error_code)
E           pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to search on QueryNode 4: channel not available[channel=channel distribution is not serviceable])>

/usr/local/lib/python3.10/dist-packages/pymilvus/client/utils.py:63: MilvusException
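
Both failure shapes above (code=503 "channel distribution is not serviceable" and code=106 "collection on recovering") describe a transient state while channel distribution is rebuilt after the upgrade, so a client-side workaround is to retry with backoff until the fix is deployed. The sketch below is a hypothetical helper, not a pymilvus API; `do_search` stands in for any search callable, and the marker strings are taken from the errors in this report:

```python
import time

# Error substrings the server returns while channel distribution is
# being rebuilt after an upgrade (taken from the traceback above).
TRANSIENT_MARKERS = (
    "channel not available",
    "collection on recovering",
)

def search_with_retry(do_search, max_attempts=5, base_delay=0.5):
    """Call do_search(); retry with exponential backoff while the error
    message matches a transient marker. Re-raise any other error
    immediately, or the last error once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return do_search()
        except Exception as exc:  # with pymilvus this would be MilvusException
            transient = any(m in str(exc) for m in TRANSIENT_MARKERS)
            if not transient or attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))
```

This only papers over the symptom; the root-cause fix is in the PRs referenced below.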

Expected Behavior

No response

Steps To Reproduce

Milvus Log

failed job: https://qa-jenkins.milvus.io/blue/organizations/jenkins/deploy_test_for_release_cron/detail/deploy_test_for_release_cron/2623/pipeline
log:
artifacts-pulsar-standalone-upgrade-2623-server-logs.tar.gz

cluster: 4am
ns: chaos-testing
pod

+ kubectl get pods -o wide
+ grep pulsar-standalone-upgrade-2623
 pulsar-standalone-upgrade-2623-bookie-0                           1/1     Running                  0                25m     10.104.16.160   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2623-bookie-1                           1/1     Running                  0                25m     10.104.23.202   4am-node27   <none>           <none>
 pulsar-standalone-upgrade-2623-bookie-2                           1/1     Running                  0                25m     10.104.25.94    4am-node30   <none>           <none>
 pulsar-standalone-upgrade-2623-broker-0                           1/1     Running                  0                25m     10.104.33.30    4am-node36   <none>           <none>
 pulsar-standalone-upgrade-2623-etcd-0                             1/1     Running                  0                25m     10.104.16.158   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2623-etcd-1                             1/1     Running                  0                25m     10.104.33.28    4am-node36   <none>           <none>
 pulsar-standalone-upgrade-2623-etcd-2                             1/1     Running                  0                25m     10.104.25.93    4am-node30   <none>           <none>
 pulsar-standalone-upgrade-2623-milvus-standalone-5984655b458wx5   1/1     Running                  3 (24m ago)      25m     10.104.16.159   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2623-minio-8cbdc47c-574f4               1/1     Running                  0                25m     10.104.16.161   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2623-proxy-0                            1/1     Running                  0                25m     10.104.33.33    4am-node36   <none>           <none>
 pulsar-standalone-upgrade-2623-recovery-0                         1/1     Running                  0                25m     10.104.33.32    4am-node36   <none>           <none>
 pulsar-standalone-upgrade-2623-zookeeper-0                        1/1     Running                  0                25m     10.104.33.31    4am-node36   <none>           <none>
 pulsar-standalone-upgrade-2623-zookeeper-1                        1/1     Running                  0                24m     10.104.16.162   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2623-zookeeper-2                        1/1     Running                  0                24m     10.104.15.172   4am-node20   <none>           <none>

Anything else?

No response

@zhuwenxing zhuwenxing added kind/bug Issues or changes related a bug needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Jan 16, 2025
@zhuwenxing zhuwenxing added this to the 2.4.21 milestone Jan 16, 2025
@zhuwenxing zhuwenxing added priority/critical-urgent Highest priority. Must be actively worked on as someone's top priority right now. severity/critical Critical, lead to crash, data missing, wrong result, function totally doesn't work. labels Jan 16, 2025
@yanliang567 yanliang567 added triage/accepted Indicates an issue or PR is ready to be actively worked on. and removed needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Jan 16, 2025
@zhuwenxing
Contributor Author

v2.4.20 --> 2.5-20250116-c945efa8-amd64

Some cases failed with a search error, and some failed with a load timeout.

failed job: https://qa-jenkins.milvus.io/blue/organizations/jenkins/deploy_test_for_release_cron/detail/deploy_test_for_release_cron/2624/pipeline/427

log:

artifacts-pulsar-standalone-upgrade-2624-server-logs.tar.gz


[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_BIN_IVF_FLAT_is_compacted_not_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=106, message=failed to search: loaded collection do not found any channel in target, may be in recovery: collection on recovering[collection=455347591029518914])>

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_SQ8_is_compacted_is_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=106, message=failed to search: loaded collection do not found any channel in target, may be in recovery: collection on recovering[collection=455347591030323241])>

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_PQ_is_compacted_is_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=106, message=failed to search: loaded collection do not found any channel in target, may be in recovery: collection on recovering[collection=455347591031329247])>

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_SQ8_is_compacted_not_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - Failed: Timeout >240.0s

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_FLAT_is_compacted_is_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - Failed: Timeout >240.0s

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_FLAT_is_compacted_is_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_1_is_deleted_is_deleted_data_size_3000] - pymilvus.exceptions.MilvusException: <MilvusException: (code=106, message=failed to search: loaded collection do not found any channel in target, may be in recovery: collection on recovering[collection=455347591029920800])>

[2025-01-16T08:36:09.059Z] FAILED testcases/test_action_second_deployment.py::TestActionSecondDeployment::test_check[deploy_test_index_type_IVF_PQ_is_compacted_not_compacted_segment_status_all_is_scalar_indexed_is_scalar_indexed_replica_number_0_is_deleted_is_deleted_data_size_3000] - Failed: Timeout >240.0s

[2025-01-16T08:36:09.059Z] =================== 7 failed, 75 passed in 959.28s (0:15:59) ===================
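
When triaging runs like this, it helps to bucket the FAILED lines by failure mode (load timeout vs. server error code), since code=106 and code=503 point at the same recovery issue while timeouts may not. A small hypothetical classifier over the pytest summary lines:

```python
import re

def classify_failure(line):
    """Bucket a pytest FAILED summary line from the logs above.
    Returns 'load_timeout', an error-code string like 'code=106',
    or 'other'."""
    if "Failed: Timeout" in line:
        return "load_timeout"
    m = re.search(r"code=(\d+)", line)
    if m:
        return f"code={m.group(1)}"
    return "other"
```

Feeding the seven FAILED lines above through this yields four `code=106` search failures and three load timeouts.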

pod info

+ kubectl get pods -o wide
+ grep pulsar-standalone-upgrade-2624
 pulsar-standalone-upgrade-2624-bookie-0                           1/1     Running                  0                26m     10.104.21.66    4am-node24   <none>           <none>
 pulsar-standalone-upgrade-2624-bookie-1                           1/1     Running                  0                26m     10.104.30.94    4am-node38   <none>           <none>
 pulsar-standalone-upgrade-2624-bookie-2                           1/1     Running                  0                26m     10.104.26.205   4am-node32   <none>           <none>
 pulsar-standalone-upgrade-2624-broker-0                           1/1     Running                  0                26m     10.104.16.215   4am-node21   <none>           <none>
 pulsar-standalone-upgrade-2624-etcd-0                             1/1     Running                  0                26m     10.104.21.65    4am-node24   <none>           <none>
 pulsar-standalone-upgrade-2624-etcd-1                             1/1     Running                  0                26m     10.104.30.91    4am-node38   <none>           <none>
 pulsar-standalone-upgrade-2624-etcd-2                             1/1     Running                  0                26m     10.104.23.34    4am-node27   <none>           <none>
 pulsar-standalone-upgrade-2624-milvus-standalone-766d57fc4z8rrh   1/1     Running                  4 (21m ago)      26m     10.104.30.93    4am-node38   <none>           <none>
 pulsar-standalone-upgrade-2624-minio-dbc754867-vd569              1/1     Running                  0                26m     10.104.21.68    4am-node24   <none>           <none>
 pulsar-standalone-upgrade-2624-proxy-0                            1/1     Running                  0                26m     10.104.27.132   4am-node31   <none>           <none>
 pulsar-standalone-upgrade-2624-recovery-0                         1/1     Running                  0                26m     10.104.9.154    4am-node14   <none>           <none>
 pulsar-standalone-upgrade-2624-zookeeper-0                        1/1     Running                  0                26m     10.104.21.67    4am-node24   <none>           <none>
 pulsar-standalone-upgrade-2624-zookeeper-1                        1/1     Running                  0                26m     10.104.30.95    4am-node38   <none>           <none>
 pulsar-standalone-upgrade-2624-zookeeper-2                        1/1     Running                  0                25m     10.104.32.31    4am-node39   <none>           <none>

congqixia added a commit to congqixia/milvus that referenced this issue Jan 16, 2025
congqixia added a commit to congqixia/milvus that referenced this issue Jan 16, 2025
congqixia added a commit to congqixia/milvus that referenced this issue Jan 17, 2025
Related to milvus-io#39339

Extra indexes can be ignored for most cases since sorted pk column
already provided indexing features

Signed-off-by: Congqi Xia <[email protected]>
congqixia added a commit to congqixia/milvus that referenced this issue Jan 17, 2025
Related to milvus-io#39339

Extra indexes can be ignored for most cases since sorted pk column
already provided indexing features

Signed-off-by: Congqi Xia <[email protected]>
sre-ci-robot pushed a commit that referenced this issue Jan 17, 2025
Cherry-pick from master
pr: #39389
Related to #39339

Extra indexes can be ignored for most cases since sorted pk column
already provided indexing features

---------

Signed-off-by: Congqi Xia <[email protected]>
sre-ci-robot pushed a commit that referenced this issue Jan 17, 2025
congqixia added a commit to congqixia/milvus that referenced this issue Jan 20, 2025
Related to milvus-io#39339
Previous PR milvus-io#39389 only skips append index into segment

Also related to milvus-io#39428

Signed-off-by: Congqi Xia <[email protected]>
congqixia added a commit to congqixia/milvus that referenced this issue Jan 20, 2025
Related to milvus-io#39339
Previous PR milvus-io#39389 only skips append index into segment

Also related to milvus-io#39428

Signed-off-by: Congqi Xia <[email protected]>
sre-ci-robot pushed a commit that referenced this issue Jan 20, 2025
…9438)

Cherry pick from master
pr: #39437

Related to #39339
Previous PR #39389 only skips append index into segment

Also related to #39428

Signed-off-by: Congqi Xia <[email protected]>
sre-ci-robot pushed a commit that referenced this issue Jan 20, 2025
Related to #39339

Extra indexes can be ignored for most cases since sorted pk column
already provided indexing features

---------

Signed-off-by: Congqi Xia <[email protected]>
sre-ci-robot pushed a commit that referenced this issue Jan 20, 2025
Related to #39339
Previous PR #39389 only skips append index into segment

Also related to #39428

Signed-off-by: Congqi Xia <[email protected]>
@zhuwenxing
Contributor Author

verified and fixed

@sevenold

pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to Query on QueryNode 137: channel not available[channel=channel distribution is not serviceable])>

pymilvus:2.5.3


In which version was the fix implemented?
@zhuwenxing

@yanliang567
Contributor

Please retry on Milvus v2.5.4, which will be available in a day or two.
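
For anyone hitting this on an older build, a simple gate on the server version tells you whether the fix (shipped in v2.5.4 per the reply above) is present before deciding to retry or upgrade. This is a plain semver-tuple comparison sketch; with pymilvus, the version string would come from the server rather than being hard-coded:

```python
def parse_version(v):
    """Parse 'v2.5.4' or '2.5.4' (or a build tag like
    '2.5-20250116-c945efa8-amd64') into a comparable tuple."""
    return tuple(int(p) for p in v.lstrip("v").split("-")[0].split("."))

def has_channel_fix(server_version, fixed_in="2.5.4"):
    """True if the running server already contains the fix discussed
    in this issue (assumed to have shipped in v2.5.4)."""
    return parse_version(server_version) >= parse_version(fixed_in)
```

Nightly tags like `2.5-20250116-...` parse as `(2, 5)` and compare below `(2, 5, 4)`, so they are conservatively treated as unfixed.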
