I set up a Linux VM with the Linux Azure Diagnostics extension, configured to push syslog messages to an Event Hub.
I can see my syslog messages on the Event Hub's Process Data blade. Now I'm trying to send those logs to Azure Data Explorer (ADX), and to do that I followed the steps below.
I created a database (Syslog
) and a table (SyslogTable
) to store the syslog messages. Everything went through without any errors: `.show ingestion failures`
reports nothing, yet I cannot see any data in the ADX table.
Below is my sample configuration.
Sample data as viewed from the Event Hub, in JSON format:
{
"time": "2020-05-18T15:54:01.0000000Z",
"resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
"properties": {
"ident": "systemd",
"Ignore": "syslog",
"Facility": "daemon",
"Severity": "info",
"EventTime": "2020-05-18T15:54:01.0000000",
"SendingHost": "localhost",
"Msg": "Removed slice User Slice of root.",
"hostname": "vmname",
"FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
},
"category": "daemon",
"level": "info",
"operationName": "LinuxSyslogEvent",
"EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
"PartitionId": 0,
"EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}
ADX table schema
.create table SyslogTable (
eventTime: datetime,
resourceId: string,
properties: dynamic,
category: string,
level: string,
operationName: string,
EventProcessedUtcTime: string,
PartitionId: int,
EventEnqueuedUtcTime: datetime
)
ADX Syslog table mapping
.create table SyslogTable ingestion json mapping "SyslogMapping"
'['
' {"column":"eventTime", "Properties": {"Path": "$.time"}},'
' {"column":"resourceId", "Properties": {"Path":"$.resourceId"}},'
' {"column":"properties", "Properties": {"Path":"$.properties"}},'
' {"column":"category", "Properties": {"Path":"$.category"}},'
' {"column":"level", "Properties": {"Path": "$.level"}},'
' {"column":"operationName", "Properties": {"Path": "$.operationName"}},'
' {"column":"EventProcessedUtcTime", "Properties": {"Path": "$.EventProcessedUtcTime"}},'
' {"column":"PartitionId", "Properties": {"Path": "$.PartitionId"}},'
' {"column":"EventEnqueuedUtcTime", "Properties": {"Path": "$.EventEnqueuedUtcTime"}}'
']'
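Before suspecting the pipeline, one way to sanity-check a mapping like the one above is to resolve each JSON path against a sample event locally. This is a minimal stdlib sketch, not the ADX mapping engine: `resolve` handles only the flat `$.field` paths this particular mapping uses, and the event below is a trimmed subset of the sample payload.

```python
import json

# Trimmed subset of the sample event shown above.
event = json.loads('''{
    "time": "2020-05-18T15:54:01.0000000Z",
    "properties": {"Msg": "Removed slice User Slice of root."},
    "category": "daemon",
    "level": "info",
    "operationName": "LinuxSyslogEvent",
    "PartitionId": 0
}''')

# The "$.name" paths from SyslogMapping (top-level only, which is
# all this mapping needs).
paths = ["$.time", "$.properties", "$.category", "$.level",
         "$.operationName", "$.PartitionId"]

def resolve(doc, path):
    """Resolve a simple '$.field' JSON path; None if the field is absent."""
    assert path.startswith("$.")
    return doc.get(path[2:])

for p in paths:
    value = resolve(event, p)
    print(p, "->", value)
    assert value is not None, f"mapping path {p} finds nothing in the event"
```

If a path came back `None` here, the corresponding ADX column would simply be ingested as null; since every path resolves against the sample, the mapping itself is unlikely to be the problem.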
Data connection settings
Table: SyslogTable
Column Mapping: SyslogMapping
Data Format: Multiline Json/Json # tried with both
So, am I missing something?
The reason no data was being pushed to the ADX table was that I had specified the $Default
consumer group in the data connection settings, while I was already using $Default
elsewhere to pull events from the Event Hub.
So the fix was simply to create a new consumer group on the Event Hub and create a new data connection that uses it.
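The effect of sharing a consumer group can be illustrated with a toy model. This is a deliberate simplification: real Event Hubs track an offset (checkpoint) per consumer group per partition and allow only one active epoch reader per partition within a group; the sketch below approximates that as a single shared offset per group, with the group name "adx-ingest" being a made-up example.

```python
from collections import defaultdict

class PartitionModel:
    """Toy model of one Event Hub partition: an append-only log plus a
    per-consumer-group read offset (a stand-in for checkpointing)."""
    def __init__(self, events):
        self.log = list(events)
        self.offsets = defaultdict(int)  # consumer group -> next offset

    def receive(self, group):
        """Return every event this consumer group has not yet seen."""
        start = self.offsets[group]
        batch = self.log[start:]
        self.offsets[group] = len(self.log)
        return batch

hub = PartitionModel(["e1", "e2", "e3"])

# An existing reader on $Default consumes the stream first...
drained = hub.receive("$Default")            # ["e1", "e2", "e3"]
# ...so the ADX data connection, also on $Default, finds nothing new:
adx_sees_shared = hub.receive("$Default")    # []
# A dedicated group keeps its own offset and sees every event:
adx_sees_dedicated = hub.receive("adx-ingest")  # ["e1", "e2", "e3"]
```

This is why each independent downstream reader of an Event Hub should get its own consumer group.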
Given the table schema and the payload schema, there appears to be nothing wrong with your ingestion mapping.
For example, if you run the following, you will see the data ingested successfully:
.ingest inline into table SyslogTable with(format=multijson, ingestionMappingReference='SyslogMapping') <|
{
"time": "2020-05-18T15:54:01.0000000Z",
"resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
"properties": {
"ident": "systemd",
"Ignore": "syslog",
"Facility": "daemon",
"Severity": "info",
"EventTime": "2020-05-18T15:54:01.0000000",
"SendingHost": "localhost",
"Msg": "Removed slice User Slice of root.",
"hostname": "vmname",
"FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
},
"category": "daemon",
"level": "info",
"operationName": "LinuxSyslogEvent",
"EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
"PartitionId": 0,
"EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}
To resolve the issue you're facing, and assuming you have already verified that the data is being pushed to the Event Hub successfully, I would recommend opening a support ticket for your resource through the Azure portal.