Compare commits

...

9 Commits

Author SHA1 Message Date
guozhenwei 7cac54cb88
docs: update WebSocket connection API support status and usage restrictions in documentation (#32393) 2025-08-01 13:29:29 +08:00
Simon Guan fe1658b79c
merge: from 3.3.6 to main branch (#32389)
* enh: grant support for data source ORC (#32378)

* fix: wrong col_id in ins_columns (#32358)

* test: remove un checked case (#32388)

* fix(parser): subquery use last_row can't found the colname (#32353)

* add col_id info in ins_columns

* fix: test case

---------

Co-authored-by: Kaili Xu <klxu@taosdata.com>
Co-authored-by: Tony Zhang <34825804+Tony2h@users.noreply.github.com>
Co-authored-by: hongzhenliu <wluckyjob@gmail.com>
Co-authored-by: Tony Zhang <tonyzhang@taosdata.com>
2025-08-01 10:04:15 +08:00
Zhixiao Bao 74a1397134
test: mute recalc manual case. (#32413) 2025-08-01 10:02:07 +08:00
kevin men fc410c3828
docs: escape_character parameter in document classification error (#32421) 2025-08-01 09:57:01 +08:00
haoranchen db89e67e03
docs: add blank line for better readability in product classification section (#32417) 2025-07-31 19:07:45 +08:00
WANG Xu 32b912b9f3
Merge pull request #32416 from taosdata/chore/wangxu/trigger-doc-build
chore: trigger doc build
2025-07-31 18:32:35 +08:00
WANG Xu 9f6a552d97
chore: trigger doc build
Signed-off-by: WANG Xu <feici02@outlook.com>
2025-07-31 18:31:57 +08:00
Simon Guan 6194b1d3e3
test: reproduce bug (#32414) 2025-07-31 17:53:47 +08:00
Haojun Liao bf2b65d9b5
fix(gpt): support scalar operation in forecast function. (#32409) 2025-07-31 17:27:34 +08:00
19 changed files with 1359 additions and 1383 deletions

View File

@ -216,8 +216,11 @@ In insertion scenarios, `filetype` must be set to `insert`. For this parameter a
- **keep_trying**: Number of retries after failure, default is no retry. Requires version v3.0.9 or above.
- **trying_interval**: Interval between retries in milliseconds, effective only when retries are specified in keep_trying. Requires version v3.0.9 or above.
- **childtable_from and childtable_to**: Specifies the range of child tables to write to, the interval is [childtable_from, childtable_to].
- **escape_character**: Whether the supertable and child table names contain escape characters, default is "no", options are "yes" or "no".
- **continue_if_fail**: Allows users to define the behavior after a failure.
"continue_if_fail": "no": taosBenchmark exits automatically upon failure (default behavior)
@ -244,8 +247,6 @@ Parameters related to supertable creation are configured in the `super_tables` s
- **childtable_prefix**: Prefix for child table names, mandatory, no default value.
- **escape_character**: Whether the supertable and child table names contain escape characters, default is "no", options are "yes" or "no".
- **auto_create_table**: Effective only when insert_mode is taosc, rest, or stmt and child_table_exists is "no". "yes" means taosBenchmark will automatically create non-existent tables during data insertion; "no" means all tables are created in advance before insertion.
- **batch_create_tbl_num**: Number of tables created per batch during child table creation, default is 10. Note: The actual number of batches may not match this value, if the executed SQL statement exceeds the maximum supported length, it will be automatically truncated and executed, continuing the creation.

View File

@ -150,12 +150,12 @@ TDengine client driver supports WebSocket connection and native connection. Most
**WebSocket connection function difference description:**
The following APIs only return a success status in WebSocket connection mode, but do not perform actual operations:
- `taos_options_connection` - Connection option settings
- `taos_connect_auth` - MD5 encrypted password connection
- `taos_set_notify_cb` - Event callback function settings
- `tmq_get_connect` - Get TMQ connection handle
| API | Support Status | Interface Description | Usage Restrictions |
| ----------------------- | ------------------- | --------------------------------- | ------------------------------------------------------------------------------ |
| taos_connect_auth | Not supported | MD5 encrypted password connection | Only returns a success status, no actual operation is performed |
| taos_set_notify_cb | Not supported | Set an event callback function | Only returns a success status, no actual operation is performed |
| tmq_get_connect | Not supported | Get a TMQ connection handle | Only returns a success status, no actual operation is performed |
| taos_options_connection | Partially supported | Set client connection options | Character set settings are not supported, and the UTF-8 character set is fixed |
These APIs are fully functional in native connection mode. If you need to use the above functions, it is recommended to choose native connection mode. Future versions will gradually improve the functional support of WebSocket connection.

View File

@ -11,6 +11,7 @@ At its core, TDengine TSDB is a high-performance, clustered, open-source, cloud-native time-series database
## TDengine TSDB Product Classification
TDengine comes in the open-source TDengine TSDB-OSS, the enterprise edition TDengine TSDB-Enterprise, and the cloud service TDengine Cloud.
- TDengine TSDB-OSS is an open-source, high-performance, cloud-native time-series database with strong elastic scalability. It ships with built-in caching, stream processing, and data subscription, which greatly reduces system design complexity and lowers development and operating costs, making it a minimalist time-series data processing platform. For more details, see [TDengine TSDB-OSS](https://www.taosdata.com/tdengine-oss).
- TDengine TSDB-Enterprise is the privately deployed edition of TDengine TSDB. It can be deployed at the edge, on premises, or in public/private clouds, and offers many enterprise features not available in the open-source edition. For details, see [TDengine TSDB-Enterprise](https://www.taosdata.com/tdengine-enterprise).
- TDengine Cloud is a fully managed cloud service platform for IoT and industrial big data, especially suitable for small and medium-scale users. For details, see [TDengine Cloud](https://cloud.taosdata.com).
@ -96,4 +97,4 @@ TDengine TSDB neither depends on any third-party software nor is it an optimization or repackaging of an open-source database
- [TDengine vs. Cassandra Comparison Test](https://www.taosdata.com/blog/2019/08/14/573.html)
- [TDengine vs. InfluxDB Write Performance Showdown](https://www.taosdata.com/2021/11/05/3248.html)
- [TDengine and InfluxDB Query Performance Comparison Test Report](https://www.taosdata.com/2022/02/22/5969.html)
- [Comparison Test Report: TDengine vs. InfluxDB, OpenTSDB, Cassandra, MySQL, ClickHouse, and Other Databases](https://www.taosdata.com/downloads/TDengine_Testing_Report_cn.pdf)
- [Comparison Test Report: TDengine vs. InfluxDB, OpenTSDB, Cassandra, MySQL, ClickHouse, and Other Databases](https://www.taosdata.com/downloads/TDengine_Testing_Report_cn.pdf)

View File

@ -119,8 +119,11 @@ taosBenchmark -f <json file>
- **keep_trying**: Number of retries after a failure; no retries by default. Requires v3.0.9 or later.
- **trying_interval**: Interval between retries in milliseconds; effective only when keep_trying specifies retries. Requires v3.0.9 or later.
- **childtable_from and childtable_to**: Specify the range of child tables to write to; the interval is [childtable_from, childtable_to].

- **escape_character**: Whether supertable and child table names contain escape characters; default is "no", options are "yes" or "no".
- **continue_if_fail**: Allows users to define the behavior after a failure.
"continue_if_fail": "no": taosBenchmark exits automatically upon failure (default behavior).
@ -147,8 +150,6 @@ taosBenchmark -f <json file>
- **childtable_prefix**: Prefix for child table names; mandatory, no default value.
- **escape_character**: Whether supertable and child table names contain escape characters; default is "no", options are "yes" or "no".
- **auto_create_table**: Effective only when insert_mode is taosc, rest, or stmt and child_table_exists is "no". "yes" means taosBenchmark automatically creates non-existent tables during data insertion; "no" means all tables are created in advance before insertion.
- **batch_create_tbl_num**: Number of tables created per batch when creating child tables; default is 10. Note: the actual number of batches may not match this value; if the generated SQL statement exceeds the maximum supported length, it is automatically truncated before execution, and creation continues.

View File

@ -148,12 +148,12 @@ The TDengine client driver supports both WebSocket and native connections. Most
**WebSocket connection function difference description:**
The following APIs currently only return a success status in WebSocket connection mode and do not perform actual operations:
- `taos_options_connection` - Connection option settings
- `taos_connect_auth` - MD5-encrypted password connection
- `taos_set_notify_cb` - Event callback function settings
- `tmq_get_connect` - Get TMQ connection handle
| API | Support Status | Interface Description | Usage Restrictions |
| ----------------------- | ------------------- | --------------------------------- | ---------------------------------------------------------------- |
| taos_connect_auth | Not supported | MD5-encrypted password connection | Only returns a success status; no actual operation is performed |
| taos_set_notify_cb | Not supported | Set an event callback function | Only returns a success status; no actual operation is performed |
| tmq_get_connect | Not supported | Get a TMQ connection handle | Only returns a success status; no actual operation is performed |
| taos_options_connection | Partially supported | Set client connection options | Character set settings are not supported; UTF-8 is always used |
These APIs are fully functional in native connection mode. If you need the above functions, native connection mode is recommended. Future versions will gradually improve functional support for WebSocket connections.

View File

@ -249,6 +249,7 @@ static const SSysDbTableSchema userColsSchema[] = {
{.name = "col_scale", .bytes = 4, .type = TSDB_DATA_TYPE_INT, .sysInfo = false},
{.name = "col_nullable", .bytes = 4, .type = TSDB_DATA_TYPE_INT, .sysInfo = false},
{.name = "col_source", .bytes = TSDB_COL_FNAME_LEN - 1 + VARSTR_HEADER_SIZE, .type = TSDB_DATA_TYPE_VARCHAR, .sysInfo = false},
{.name = "col_id", .bytes = 2, .type = TSDB_DATA_TYPE_SMALLINT, .sysInfo = false},
};
static const SSysDbTableSchema userVctbColsSchema[] = {
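For context, the `col_id` field added to `userColsSchema` becomes visible to users through `information_schema.ins_columns`. A minimal SQL sketch of how it could be queried (the database and table names below are hypothetical placeholders, not taken from this changeset):

```sql
-- Illustrative only: list each column's name, type, and its new col_id for one table.
SELECT table_name, col_name, col_type, col_id
FROM information_schema.ins_columns
WHERE db_name = 'test_db'        -- hypothetical database name
  AND table_name = 'sub_t0'      -- hypothetical table name
ORDER BY col_id;
```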

View File

@ -3546,9 +3546,12 @@ static int32_t buildDbColsInfoBlock(const SSDataBlock *p, const SSysTableMeta *p
varDataSetLen(colTypeStr, colTypeLen);
TAOS_CHECK_GOTO(colDataSetVal(pColInfoData, numOfRows, (char *)colTypeStr, false), &lino, _OVER);
// col length
pColInfoData = taosArrayGet(p->pDataBlock, 5);
TAOS_CHECK_GOTO(colDataSetVal(pColInfoData, numOfRows, (const char *)&pm->schema[j].bytes, false), &lino, _OVER);
for (int32_t k = 6; k <= 9; ++k) {
// col precision, col scale, col nullable, col source
for (int32_t k = 6; k <= 10; ++k) {
pColInfoData = taosArrayGet(p->pDataBlock, k);
colDataSetNULL(pColInfoData, numOfRows);
}
@ -3711,13 +3714,21 @@ static int32_t mndRetrieveStbCol(SRpcMsg *pReq, SShowObj *pShow, SSDataBlock *pB
varDataSetLen(colTypeStr, colTypeLen);
RETRIEVE_CHECK_GOTO(colDataSetVal(pColInfo, numOfRows, (char *)colTypeStr, false), pStb, &lino, _OVER);
// col length
pColInfo = taosArrayGet(pBlock->pDataBlock, cols++);
RETRIEVE_CHECK_GOTO(colDataSetVal(pColInfo, numOfRows, (const char *)&pStb->pColumns[i].bytes, false), pStb,
&lino, _OVER);
while (cols < pShow->numOfColumns) {
pColInfo = taosArrayGet(pBlock->pDataBlock, cols++);
// col precision, col scale, col nullable, col source
for (int32_t j = 6; j <= 9; ++j) {
pColInfo = taosArrayGet(pBlock->pDataBlock, j);
colDataSetNULL(pColInfo, numOfRows);
}
// col id
pColInfo = taosArrayGet(pBlock->pDataBlock, 10);
RETRIEVE_CHECK_GOTO(colDataSetVal(pColInfo, numOfRows, (const char *)&pStb->pColumns[i].colId, false), pStb,
&lino, _OVER);
numOfRows++;
}

View File

@ -394,6 +394,7 @@ static int32_t forecastNext(SOperatorInfo* pOperator, SSDataBlock** ppRes) {
SForecastOperatorInfo* pInfo = pOperator->info;
SSDataBlock* pResBlock = pInfo->pRes;
SForecastSupp* pSupp = &pInfo->forecastSupp;
SExprSupp* pScalarSupp = &pInfo->scalarSup;
SAnalyticBuf* pBuf = &pSupp->analyBuf;
int64_t st = taosGetTimestampUs();
int32_t numOfBlocks = pSupp->numOfBlocks;
@ -407,6 +408,14 @@ static int32_t forecastNext(SOperatorInfo* pOperator, SSDataBlock** ppRes) {
break;
}
if (pScalarSupp->pExprInfo != NULL) {
code = projectApplyFunctions(pScalarSupp->pExprInfo, pBlock, pBlock, pScalarSupp->pCtx, pScalarSupp->numOfExprs,
NULL, GET_STM_RTINFO(pOperator->pTaskInfo));
if (code != TSDB_CODE_SUCCESS) {
T_LONG_JMP(pTaskInfo->env, code);
}
}
if (pSupp->groupId == 0 || pSupp->groupId == pBlock->info.id.groupId) {
pSupp->groupId = pBlock->info.id.groupId;
numOfBlocks++;
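The `pScalarSupp` projection above applies scalar expressions to each input block before it is buffered for the forecast call. A hedged SQL sketch of what this enables (the table, column, and algorithm name are assumptions, following the documented `FORECAST(expr, "algo=...")` form; exact pseudo-column names may differ):

```sql
-- Illustrative only: a scalar expression as the FORECAST input, which previously
-- had to be pre-computed in a subquery before being handed to the function.
SELECT _frowts, FORECAST(val * 1.8 + 32, "algo=arima")
FROM sensor_data;                -- hypothetical table and numeric column
```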

View File

@ -1492,11 +1492,13 @@ static int32_t sysTableUserColsFillOneTableCols(const SSysTableScanInfo* pInfo,
code = colDataSetVal(pColInfoData, numOfRows, (char*)colTypeStr, false);
QUERY_CHECK_CODE(code, lino, _end);
// col length
pColInfoData = taosArrayGet(dataBlock->pDataBlock, 5);
QUERY_CHECK_NULL(pColInfoData, code, lino, _end, terrno);
code = colDataSetVal(pColInfoData, numOfRows, (const char*)&schemaRow->pSchema[i].bytes, false);
QUERY_CHECK_CODE(code, lino, _end);
// col precision, col scale, col nullable
for (int32_t j = 6; j <= 8; ++j) {
pColInfoData = taosArrayGet(dataBlock->pDataBlock, j);
QUERY_CHECK_NULL(pColInfoData, code, lino, _end, terrno);
@ -1521,6 +1523,13 @@ static int32_t sysTableUserColsFillOneTableCols(const SSysTableScanInfo* pInfo,
code = colDataSetVal(pColInfoData, numOfRows, (char*)refColName, false);
QUERY_CHECK_CODE(code, lino, _end);
}
// col id
pColInfoData = taosArrayGet(dataBlock->pDataBlock, 10);
QUERY_CHECK_NULL(pColInfoData, code, lino, _end, terrno);
code = colDataSetVal(pColInfoData, numOfRows, (const char*)&schemaRow->pSchema[i].colId, false);
QUERY_CHECK_CODE(code, lino, _end);
++numOfRows;
}

View File

@ -3250,6 +3250,8 @@ static int32_t translateMultiResFunc(STranslateContext* pCxt, SFunctionNode* pFu
}
}
if (tsKeepColumnName && 1 == LIST_LENGTH(pFunc->pParameterList) && !pFunc->node.asAlias && !pFunc->node.asParam) {
tstrncpy(pFunc->node.aliasName, ((SExprNode*)nodesListGetNode(pFunc->pParameterList, 0))->aliasName,
TSDB_COL_NAME_LEN);
tstrncpy(pFunc->node.userAlias, ((SExprNode*)nodesListGetNode(pFunc->pParameterList, 0))->userAlias,
TSDB_COL_NAME_LEN);
}

View File

@ -172,7 +172,7 @@ taos> select table_name, db_name, columns, stable_name, type from information_sc
vtb_virtual_ntb8 | test_vtable_meta | 20 | NULL | VIRTUAL_NORMAL_TABLE |
vtb_virtual_ntb9 | test_vtable_meta | 20 | NULL | VIRTUAL_NORMAL_TABLE |
taos> select * from information_schema.ins_columns where table_type = 'VIRTUAL_NORMAL_TABLE' or table_type = 'VIRTUAL_CHILD_TABLE' order by table_name, col_name, table_type
taos> select table_name, db_name, table_type, col_name, col_type, col_length, col_precision, col_scale, col_nullable, col_source from information_schema.ins_columns where table_type = 'VIRTUAL_NORMAL_TABLE' or table_type = 'VIRTUAL_CHILD_TABLE' order by table_name, col_name, table_type
table_name | db_name | table_type | col_name | col_type | col_length | col_precision | col_scale | col_nullable | col_source |
==========================================================================================================================================================================================================================================================
vtb_virtual_ctb0 | test_vtable_meta | VIRTUAL_CHILD_TABLE | bigint_col | BIGINT | 8 | NULL | NULL | NULL | NULL |

View File

@ -13,4 +13,4 @@ describe test_vtable_meta.vtb_virtual_ctb0;
describe test_vtable_meta.vtb_virtual_ntb0;
select stable_name, db_name, columns, `tags`, isvirtual from information_schema.ins_stables where isvirtual = true order by stable_name;
select table_name, db_name, columns, stable_name, type from information_schema.ins_tables where type = 'VIRTUAL_NORMAL_TABLE' or type = 'VIRTUAL_CHILD_TABLE' order by table_name;
select * from information_schema.ins_columns where table_type = 'VIRTUAL_NORMAL_TABLE' or table_type = 'VIRTUAL_CHILD_TABLE' order by table_name, col_name, table_type;
select table_name, db_name, table_type, col_name, col_type, col_length, col_precision, col_scale, col_nullable, col_source from information_schema.ins_columns where table_type = 'VIRTUAL_NORMAL_TABLE' or table_type = 'VIRTUAL_CHILD_TABLE' order by table_name, col_name, table_type;

View File

@ -0,0 +1,46 @@
from new_test_framework.utils import tdLog, tdSql, sc, clusterComCheck
class TestLastRow:
def setup_class(cls):
tdLog.debug(f"start to execute {__file__}")
def test_last_row(self):
"""Last Row Sub Query Test
1.Create db
2.Create super table and sub table
3.Insert data into sub table
4.Query last row from sub table as a sub query; it should return the last row data
Catalog:
- Query
Since: v3.0.0.0
Labels: common,ci
Jira: TS-6365
History:
- 2025-7-29 Ethan liu adds test for a subquery using last_row
"""
tdLog.info(f"========== start sub query test")
tdSql.execute(f"drop database if exists test_sub_query")
tdSql.execute(f"create database test_sub_query")
tdSql.execute(f"use test_sub_query")
# create super table and sub table
tdSql.execute(f"create table super_t (ts timestamp, flag int) tags (t1 VARCHAR(10))")
tdSql.execute(f"create table sub_t0 using super_t tags('t1')")
tdSql.execute(f"insert into sub_t0 values (now, 0)")
tdSql.execute(f"alter local 'keepColumnName' '1'")
tdSql.execute(f"select flag from (select last_row(flag) from sub_t0) as t")
tdSql.checkRows(1)
tdLog.info(f"end sub query test successfully")

View File

@ -49,8 +49,8 @@ class TestStreamOldCaseCount:
# streams.append(self.Count31()) pass
# streams.append(self.Sliding01()) pass
# streams.append(self.Sliding02()) pass
streams.append(self.Sliding11())
# streams.append(self.Sliding21())
# streams.append(self.Sliding11()) pass
streams.append(self.Sliding21())
tdStream.checkAll(streams)
class Count01(StreamCheckItem):
@ -555,7 +555,7 @@ class TestStreamOldCaseCount:
f"create table t1(ts timestamp, a int, b int, c int, d double);"
)
tdSql.execute(
f"create stream streams1 trigger at_once IGNORE EXPIRED 1 IGNORE UPDATE 0 WATERMARK 100s into streamt as select _wstart as s, count(*) c1, sum(b), max(c) from t1 count_window(4, 2);"
f"create stream streams1 count_window(4, 2) from t1 stream_options(max_delay(3s)) into streamt as select _twstart as s, count(*) c1, sum(b), max(c) from %%trows;"
)
def insert1(self):
@ -616,7 +616,7 @@ class TestStreamOldCaseCount:
f"create table t1(ts timestamp, a int, b int, c int, d double);"
)
tdSql.execute(
f"create stream streams1 trigger at_once IGNORE EXPIRED 1 IGNORE UPDATE 0 WATERMARK 100s into streamt as select _wstart as s, count(*) c1, sum(b), max(c) from t1 count_window(4, 2);"
f"create stream streams1 count_window(4, 2) from t1 stream_options(max_delay(3s)) into streamt as select _twstart as s, _twend e, count(*) c1, sum(b), max(c) from %%trows;"
)
def insert1(self):
@ -633,10 +633,10 @@ class TestStreamOldCaseCount:
tdSql.checkResultsByFunc(
f"select * from streamt;",
lambda: tdSql.getRows() == 4
and tdSql.getData(0, 1) == 4
and tdSql.getData(1, 1) == 4
and tdSql.getData(2, 1) == 4
and tdSql.getData(3, 1) == 2,
and tdSql.getData(0, 2) == 4
and tdSql.getData(1, 2) == 4
and tdSql.getData(2, 2) == 4
and tdSql.getData(3, 2) == 2,
)
def insert2(self):
@ -646,20 +646,48 @@ class TestStreamOldCaseCount:
tdSql.checkResultsByFunc(
f"select * from streamt;",
lambda: tdSql.getRows() == 4
and tdSql.getData(0, 1) == 4
and tdSql.getData(1, 1) == 4
and tdSql.getData(2, 1) == 3
and tdSql.getData(3, 1) == 1,
and tdSql.getData(0, 2) == 4
and tdSql.getData(1, 2) == 4
and tdSql.getData(2, 2) == 4
and tdSql.getData(3, 2) == 2,
)
def insert3(self):
tdSql.execute(f"delete from t1 where ts = 1648791223002;")
# tdSql.pause()
tdSql.execute("delete from streamt where s > '2022-03-01 13:33:33.000'")
# tdSql.pause()
tdSql.execute(f"RECALCULATE STREAM streams1 from '2022-03-01 13:33:33.000';")
def check3(self):
tdSql.checkResultsByFunc(
f"select * from streamt;",
lambda: tdSql.getRows() == 3
and tdSql.getData(0, 1) == 4
and tdSql.getData(1, 1) == 4
and tdSql.getData(2, 1) == 2,
lambda: tdSql.getRows() == 4
and tdSql.getData(0, 2) == 4
and tdSql.getData(1, 2) == 4
and tdSql.getData(2, 2) == 3
and tdSql.getData(3, 2) == 1,
)
# def insert4(self):
# tdSql.execute(f"delete from t1 where ts = 1648791223002;")
# def check4(self):
# tdSql.checkResultsByFunc(
# f"select * from streamt;",
# lambda: tdSql.getRows() == 3
# and tdSql.getData(0, 1) == 4
# and tdSql.getData(1, 1) == 4
# and tdSql.getData(2, 1) == 2,
# )
# def insert5(self):
# tdSql.execute(f"RECALCULATE STREAM streams1 from '2022-03-01 13:33:33.000';")
# def check5(self):
# tdSql.checkResultsByFunc(
# f"select * from streamt;",
# lambda: tdSql.getRows() == 3
# and tdSql.getData(0, 1) == 4
# and tdSql.getData(1, 1) == 4
# and tdSql.getData(2, 1) == 2,
# )

View File

@ -34,7 +34,7 @@ class TestOdbc:
tdSql.checkData(5, 4, 8)
tdSql.query("desc information_schema.ins_columns")
tdSql.checkRows(10)
tdSql.checkRows(11)
tdSql.checkData(0, 0, "table_name")
tdSql.checkData(5, 0, "col_length")
tdSql.checkData(1, 2, 64)

View File

@ -340,6 +340,7 @@
## 08-SubQuery
,,y,.,./ci/pytest.sh pytest cases/07-DataQuerying/08-SubQuery/test_nestquery.py
,,y,.,./ci/pytest.sh pytest cases/07-DataQuerying/08-SubQuery/test_timeline.py
,,y,.,./ci/pytest.sh pytest cases/07-DataQuerying/08-SubQuery/test_last_row.py
## 09-SelectList
,,y,.,./ci/pytest.sh pytest cases/07-DataQuerying/09-SelectList/test_column_1.py
,,y,.,./ci/pytest.sh pytest cases/07-DataQuerying/09-SelectList/test_column_7.py
@ -577,8 +578,8 @@
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_delete_recalc.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_watermark.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_combined_options.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual_with_options.py
#,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual.py
#,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual_with_options.py
## 20-UseCase
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_idmp_meters.py

View File

@ -1,138 +0,0 @@
## 01-Snode
,,y,.,./ci/pytest.sh pytest cases/01-DataTypes/test_datatype_bigint.py
#,,y,.,./ci/pytest.sh pytest cases/02-Databases/01-Create/test_db_basic1.py
#,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_mgmt.py -N 8
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_mgmt_zlv.py -N 6 --replica 3
#,,n,.,pytest cases/13-StreamProcessing/01-Snode/snode_mgmt_zlv.py -N 6 --replica 3
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_mgmt.py -N 8
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_replicas.py -N 8
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_privileges.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_privileges_monitor_table.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_privileges_recalc.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_privileges_twodb.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_params_alter.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_params_alter_normaluser.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_params_check_default.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_params_check_maxValue.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/01-Snode/snode_params_check.py
## 02-Stream
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/02-Stream/stream_nosnode.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/02-Stream/stream_checkname.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/02-Stream/stream_long_name.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/02-Stream/stream_samename.py
## 03-TriggerMode
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_state.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_count.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_event.py
#,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_notify.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_fill_history.py
#,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_sliding.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/03-TriggerMode/test_window_close_state_window.py
,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_sliding.py
,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_state_new.py
#,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_state_disorderNupdate_new.py
#,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_count_new.py
,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_event_new.py
,,n,.,pytest cases/13-StreamProcessing/03-TriggerMode/test_period_1.py
## 04-Options
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options_vtbl.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options_abnormal.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options_abnormal_vtbl.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_meta.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_meta_vtbl.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_disorderUpdateDelete.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_disorderUpdateDelete_vtbl.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options_us.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/04-Options/test_options_ns.py
## 05-Notify
## 06-ResultSaved
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/06-ResultSaved/test_result_saved_comprehensive.py
## 07-SubQuery
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_basic.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_limit.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_count_1.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_count_2.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_event.py
#,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_period.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_sliding.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_session.py
,,n,.,pytest cases/13-StreamProcessing/07-SubQuery/test_subquery_state.py
## 08-Recalc
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_expired_time.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_ignore_disorder.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_delete_recalc.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_watermark.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_combined_options.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/08-Recalc/test_recalc_manual_with_options.py
## 20-UseCase
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_idmp_meters.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_idmp_vehicle.py
,,n,.,pytest cases/13-StreamProcessing/20-UseCase/test_idmp_meters_td36808.py
,,n,.,pytest cases/13-StreamProcessing/20-UseCase/test_idmp_tobacco.py
,,n,.,pytest cases/13-StreamProcessing/20-UseCase/test_idmp_pv.py
,,n,.,pytest cases/13-StreamProcessing/20-UseCase/test_nevados.py
,,n,.,pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_phase1.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_case4.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_case4_bug1.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_case5.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case1_bug1.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case1_twostream.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case3.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case4.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case6.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case17.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case18.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case19.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case19_bug1.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/20-UseCase/test_three_gorges_second_case22.py
## 21-Stability
## 22-Performance
## 23-Compatibility
,,n,.,pytest cases/13-StreamProcessing/23-Compatibility/stream_compatibility.py
## 30-OldPyCases
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_state_window.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_window_true_for.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_math_func.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_string_func.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_backquote_check.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_taosdShell.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_checkpoint_info.py -N 4
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_snode_restart_with_checkpoint.py -N 4
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_stream_multi_agg.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_stream_basic.py
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_compatibility_rolling_upgrade.py -N 3
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_compatibility_rolling_upgrade_all.py -N 3
,,n,.,pytest cases/13-StreamProcessing/30-OldPyCases/test_compatibility.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_drop.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_empty_identifier.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/30-OldPyCases/test_oldcase_at_once.py
## 31-OldCases
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_basic1.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_basic2.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_check.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_checkpoint.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_concat.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_continuewindowclose.py
,,n,.,pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_state.py
,,y,.,./ci/pytest.sh pytest cases/13-StreamProcessing/31-OldTsimCases/test_oldcase_twa.py
## 99-Others

File diff suppressed because it is too large

View File

@ -2067,6 +2067,10 @@ static BArray *initChildCols(int colsSize) {
int prepareSampleData(SDataBase* database, SSuperTable* stbInfo) {
stbInfo->lenOfCols = accumulateRowLen(stbInfo->cols, stbInfo->iface);
stbInfo->lenOfTags = accumulateRowLen(stbInfo->tags, stbInfo->iface);
if (stbInfo->useTagTableName) {
// add tag table name length
stbInfo->lenOfTags += TSDB_TABLE_NAME_LEN + 1; // +1 for comma
}
if (stbInfo->partialColNum != 0
&& ((stbInfo->iface == TAOSC_IFACE
|| stbInfo->iface == REST_IFACE))) {