Alibaba Cloud's example:
https://help.aliyun.com/document_detail/28994.html
Using the system time as the log time

Log sample:

10.116.14.201,-,2/25/2016,11:53:17,W3SVC7,2132,200,0,GET,project/shenzhen-test/logstore/logstash/detail,C:\test\csv\test_csv.log

Collection configuration:

input {
    file {
        type => "csv_log_1"
        path => ["C:/test/csv/*.log"]
        start_position => "beginning"
    }
}
filter {
    if [type] == "csv_log_1" {
        csv {
            separator => ","
            columns => ["ip", "a", "date", "time", "b", "latency", "status", "size", "method", "url", "file"]
        }
    }
}
output {
    if [type] == "csv_log_1" {
        logservice {
            codec => "json"
            endpoint => "***"
            project => "***"
            logstore => "***"
            topic => ""
            source => ""
            access_key_id => "***"
            access_key_secret => "***"
            max_send_retry => 10
        }
    }
}

Note:
- The configuration file must be saved as UTF-8 without BOM; you can use Notepad++ to change the file encoding.
- Use UNIX-style separators in the path value, e.g. C:/test/multiline/*.log; otherwise wildcard matching will not work.
- The type field must be consistent within a single configuration file. If multiple Logstash configuration files exist on one machine, each configuration's type must be unique, or data will be processed incorrectly.

Related plugins: file, csv.

Restart Logstash to apply the changes: place the configuration file in the conf directory (see "Configure Logstash"), then restart Logstash.

Using a log field as the log time

Log sample:

10.116.14.201,-,Feb 25 2016 14:03:44,W3SVC7,1332,200,0,GET,project/shenzhen-test/logstore/logstash/detail,C:\test\csv\test_csv_withtime.log

Collection configuration:

input {
    file {
        type => "csv_log_2"
        path => ["C:/test/csv_withtime/*.log"]
        start_position => "beginning"
    }
}
filter {
    if [type] == "csv_log_2" {
        csv {
            separator => ","
            columns => ["ip", "a", "datetime", "b", "latency", "status", "size", "method", "url", "file"]
        }
        date {
            match => [ "datetime", "MMM dd YYYY HH:mm:ss" ]
        }
    }
}
output {
    if [type] == "csv_log_2" {
        logservice {
            codec => "json"
            endpoint => "***"
            project => "***"
            logstore => "***"
            topic => ""
            source => ""
            access_key_id => "***"
            access_key_secret => "***"
            max_send_retry => 10
        }
    }
}

Note:
- The configuration file must be saved as UTF-8 without BOM; you can use Notepad++ to change the file encoding.
- Use UNIX-style separators in the path value, e.g. C:/test/multiline/*.log; otherwise wildcard matching will not work.
- The type field must be consistent within a single configuration file. If multiple Logstash configuration files exist on one machine, each configuration's type must be unique, or data will be processed incorrectly.
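To sanity-check the second configuration's date filter, the pattern "MMM dd YYYY HH:mm:ss" matches timestamps like the one in the sample line. A minimal Python sketch (an illustration, not part of the Logstash pipeline) shows the equivalent strptime format, assuming an English locale for the month abbreviation:

```python
from datetime import datetime

# The Logstash date-filter pattern "MMM dd YYYY HH:mm:ss" corresponds
# to this strptime format string (English month abbreviations assumed).
sample = "Feb 25 2016 14:03:44"
parsed = datetime.strptime(sample, "%b %d %Y %H:%M:%S")
print(parsed.isoformat())  # 2016-02-25T14:03:44
```

If a line's datetime field does not match this pattern, Logstash falls back to the processing time, so it is worth verifying the format against real log lines first.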
Fields:
"Version","CsvFileFormatVersion","OutputInfo","OutputInfoFaultComponentInfo","OutputInfoPinInfo","InspectionMachine","InspectionProcess","ProgramName ","BothSideCode","CreateMachine","LibraryName","ReferencePosition","PcbSizeHeight","PcbSizeWidth","RailWidth","SavedDate","PcbNo","TestTime","CreateDate","PersonRevisor","PersonRevisorCode","ReviseTime","ReviseEndDate","TestResult","ReviseResult","OverlookFault","LotCount","Barcode","FaultRate","ComponentTotal","PinTotal","LandTotal","OutComponentTotal","VisualFaultFlag","RevisorMachineId","RevisorComputerName","Barcode","BothSideCode","PcbNo","ComponentBlockNo","ComponentBlockName","ComponentNo","PartsName","PartsTypeNo","PartsVarNo","PartsVarName","PartsArticleNo","PinNo","PinSpaceNo","FaultCode","RevisedFaultId","XShift","YShift","AngleMount","ComponentReviseEndDate","OutComponentFlag","FaultLandCount","Grid","ComponentPersonRevisor","ComponentReviseTime","ComponentReviseMachineId","ComponentReviserComputerName"
input {
    file {
        path => ["/opt/logstash-6.0.0/config/csv/*.csv"]
        start_position => "beginning"
    }
}
filter {
    csv {
        separator => ","
        columns => ["Version","CsvFileFormatVersion","OutputInfo","OutputInfoFaultComponentInfo","OutputInfoPinInfo","InspectionMachine","InspectionProcess","ProgramName ","BothSideCode","CreateMachine","LibraryName","ReferencePosition","PcbSizeHeight","PcbSizeWidth","RailWidth","SavedDate","PcbNo","TestTime","CreateDate","PersonRevisor","PersonRevisorCode","ReviseTime","ReviseEndDate","TestResult","ReviseResult","OverlookFault","LotCount","Barcode","FaultRate","ComponentTotal","PinTotal","LandTotal","OutComponentTotal","VisualFaultFlag","RevisorMachineId","RevisorComputerName","Barcode","BothSideCode","PcbNo","ComponentBlockNo","ComponentBlockName","ComponentNo","PartsName","PartsTypeNo","PartsVarNo","PartsVarName","PartsArticleNo","PinNo","PinSpaceNo","FaultCode","RevisedFaultId","XShift","YShift","AngleMount","ComponentReviseEndDate","OutComponentFlag","FaultLandCount","Grid","ComponentPersonRevisor","ComponentReviseTime","ComponentReviseMachineId","ComponentReviserComputerName"]
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
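Conceptually, the csv filter splits each line on the separator and pairs the values with the column names to build event fields. A small Python sketch of that pairing, using the shorter IIS-style column list from the first example (illustrative only, not Logstash internals):

```python
import csv
import io

# Column names as configured in the csv filter.
columns = ["ip", "a", "date", "time", "b", "latency",
           "status", "size", "method", "url", "file"]

# The sample log line from the first example above.
line = ('10.116.14.201,-,2/25/2016,11:53:17,W3SVC7,2132,200,0,GET,'
        'project/shenzhen-test/logstore/logstash/detail,'
        r'C:\test\csv\test_csv.log')

# Split on the separator and zip values with column names into an event.
row = next(csv.reader(io.StringIO(line)))
event = dict(zip(columns, row))
print(event["latency"], event["status"])  # 2132 200
```

One caveat for the long column list above: names such as "Barcode", "BothSideCode", and "PcbNo" appear twice, and "ProgramName " carries a trailing space. In a plain dict the later duplicate would overwrite the earlier one, so it is worth checking how repeated column names behave in your Logstash version before relying on those fields.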