
Commit a412d8e

handle-missing-value-II: tried to knit a couple of times but it kept failing; finally settled on sample(length(fls), 1), which samples only one file rather than all 185 files.
1 parent 8054aed commit a412d8e
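
As a minimal sketch of the workaround described in the commit message (`pth` and the filename pattern are taken from the Rmd diff below; the `set.seed()` call is a hypothetical addition, only to make the sampled file reproducible):

```r
## Workaround sketch: knit against one randomly chosen weekly file
## instead of all 185 files in the folder.
## `pth` and the pattern come from the Rmd; set.seed() is a hypothetical
## addition so the sampled file is the same across knits.
pth <- 'C:/Users/scibr/Documents/GitHub/scibrokes/real-time-fxcm/data/USDJPY/'
fls <- list.files(pth, pattern = '^Y[0-9]{4}W[1-9]{1,2}_m1.rds$')

set.seed(1)
fls_sub <- fls[sort(sample(length(fls), 1))]  # sort() only matters when sampling > 1 index
fls_sub
```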

8 files changed (+10,424 −10,377 lines)


Diff for: Handle-Missing-Value-II.Rmd

+29 −4
@@ -101,12 +101,36 @@ rm(pkg, funs)
 
 As with the earlier univariate case, I first randomly import the data at 1-minute intervals.
 
-```{r warning=FALSE}
+```
+Error in optim(init[mask], getLike, method = "L-BFGS-B", lower = rep(0, : L-BFGS-B needs finite values of 'fn'
+17. optim(init[mask], getLike, method = "L-BFGS-B", lower = rep(0, np + 1L), upper = rep(Inf, np + 1L), control = optim.control)
+16. StructTS(data, ...)
+15. na.kalman(data, ...)
+14. apply.base.algorithm(data, algorithm = algorithm, ...)
+13. .f(.x[[i]], ...)
+12. map(., na.seadec, algorithm = x)
+11. function_list[[i]](value)
+10. freduce(value, `_function_list`)
+9. `_fseq`(`_lhs`)
+8. eval(quote(`_fseq`(`_lhs`)), env, env)
+7. eval(quote(`_fseq`(`_lhs`)), env, env)
+6. withVisible(eval(quote(`_fseq`(`_lhs`)), env, env))
+5. data_m1_NA %>% dplyr::select(starts_with("Ask"), starts_with("Bid")) %>% map(na.seadec, algorithm = x) %>% as.tibble
+4. FUN(X[[i]], ...)
+3. lapply(pieces, .fun, ...)
+2. structure(lapply(pieces, .fun, ...), dim = dim(pieces))
+1. llply(algo, function(x) { data_m1_NA %>% dplyr::select(starts_with("Ask"), starts_with("Bid")) %>% map(na.seadec, algorithm = x) %>% as.tibble })
+```
+
+Since this error message keeps occurring ([#imputeTS/issues/26](https://github.com/SteffenMoritz/imputeTS/issues/26)), I use sort(sample(length(fls), 1)) to randomly select 1 file (see the sketch after this diff).
+
+```{r warning=FALSE, message=FALSE}
 pth <- 'C:/Users/scibr/Documents/GitHub/scibrokes/real-time-fxcm/data/USDJPY/'
 fls <- list.files(pth, pattern = '^Y[0-9]{4}W[1-9]{1,2}_m1.rds$')
 
 ## 1-minute data
-data_m1 <- llply(fls, function(x) {
+## Since the error message keeps occurring, I use sort(sample(length(fls), 1)) to randomly select 1 file.
+data_m1 <- llply(fls[sort(sample(length(fls), 1))], function(x) {
   y <- readRDS(paste0(pth, x)) %>%
     dplyr::rename(index = DateTime) %>%
     mutate(index = index %>% mdy_hms %>%
@@ -146,12 +170,13 @@ data_m1 %>%
 
 Next, I import the tick data^[For more details, see [一、什么是Tick Data](https://www.fmz.com/bbs-topic/457)] and convert it to 1-minute bars.
 
-```{r, warning=FALSE}
+```{r, warning=FALSE, message=FALSE}
 pth <- 'C:/Users/scibr/Documents/GitHub/scibrokes/real-time-fxcm/data/USDJPY/'
 fls <- list.files(pth, pattern = '^Y[0-9]{4}W[1-9]{1,2}.rds$')
 
 ## Convert tick data into 1-minute data
-data_tm1 <- llply(fls, function(x) {
+## Since the error message keeps occurring, I use sort(sample(length(fls), 1)) to randomly select 1 file.
+data_tm1 <- llply(fls[sort(sample(length(fls), 1))], function(x) {
   y <- readRDS(paste0(pth, x)) %>%
     convertOHLC(combine = TRUE)
Diff for: Handle-Missing-Value-II.html

+10,204 −10,253
Large diffs are not rendered by default.
−57.8 KB; binary files not shown.

Diff for: binary-Q1Inter-HFT.html

+73 −41
Large diffs are not rendered by default.

Diff for: logs/log4j.spark.log

+39 −78
@@ -1,78 +1,39 @@
-18/10/24 20:32:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
-18/10/24 20:32:34 INFO SparkContext: Running Spark version 2.3.2
-18/10/24 20:32:34 INFO SparkContext: Submitted application: sparklyr
-18/10/24 20:32:34 INFO SecurityManager: Changing view acls to: scibr
-18/10/24 20:32:34 INFO SecurityManager: Changing modify acls to: scibr
-18/10/24 20:32:34 INFO SecurityManager: Changing view acls groups to:
-18/10/24 20:32:34 INFO SecurityManager: Changing modify acls groups to:
-18/10/24 20:32:34 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(scibr); groups with view permissions: Set(); users with modify permissions: Set(scibr); groups with modify permissions: Set()
-18/10/24 20:32:35 INFO Utils: Successfully started service 'sparkDriver' on port 50109.
-18/10/24 20:32:35 INFO SparkEnv: Registering MapOutputTracker
-18/10/24 20:32:35 INFO SparkEnv: Registering BlockManagerMaster
-18/10/24 20:32:35 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
-18/10/24 20:32:35 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
-18/10/24 20:32:35 INFO DiskBlockManager: Created local directory at C:\Users\scibr\AppData\Local\Temp\blockmgr-42837de4-0ea7-4343-9942-202186a58ec8
-18/10/24 20:32:35 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
-18/10/24 20:32:36 INFO SparkEnv: Registering OutputCommitCoordinator
-18/10/24 20:32:36 INFO Utils: Successfully started service 'SparkUI' on port 4040.
-18/10/24 20:32:36 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://127.0.0.1:4040
-18/10/24 20:32:36 INFO SparkContext: Added JAR file:/C:/Users/scibr/Documents/R/win-library/3.5/sparklyr/java/sparklyr-2.3-2.11.jar at spark://127.0.0.1:50109/jars/sparklyr-2.3-2.11.jar with timestamp 1540380756770
-18/10/24 20:32:36 INFO Executor: Starting executor ID driver on host localhost
-18/10/24 20:32:37 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50131.
-18/10/24 20:32:37 INFO NettyBlockTransferService: Server created on 127.0.0.1:50131
-18/10/24 20:32:37 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
-18/10/24 20:32:37 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 127.0.0.1, 50131, None)
-18/10/24 20:32:37 INFO BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:50131 with 366.3 MB RAM, BlockManagerId(driver, 127.0.0.1, 50131, None)
-18/10/24 20:32:37 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 127.0.0.1, 50131, None)
-18/10/24 20:32:37 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 127.0.0.1, 50131, None)
-18/10/24 20:32:38 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
-18/10/24 20:32:50 INFO SparkContext: Invoking stop() from shutdown hook
-18/10/24 20:32:50 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
-18/10/24 20:32:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
-18/10/24 20:32:50 INFO MemoryStore: MemoryStore cleared
-18/10/24 20:32:50 INFO BlockManager: BlockManager stopped
-18/10/24 20:32:50 INFO BlockManagerMaster: BlockManagerMaster stopped
-18/10/24 20:32:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
-18/10/24 20:32:50 INFO SparkContext: Successfully stopped SparkContext
-18/10/24 20:32:50 INFO ShutdownHookManager: Shutdown hook called
-18/10/24 20:32:50 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-5e26a35b-7700-4fe5-b4cd-9045c7253e3a
-18/10/24 20:32:50 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-6b3b50ff-32dc-4256-af6f-adbe92b866ff
-18/10/24 20:37:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
-18/10/24 20:37:32 INFO SparkContext: Running Spark version 2.3.2
-18/10/24 20:37:32 INFO SparkContext: Submitted application: sparklyr
-18/10/24 20:37:32 INFO SecurityManager: Changing view acls to: scibr
-18/10/24 20:37:32 INFO SecurityManager: Changing modify acls to: scibr
-18/10/24 20:37:32 INFO SecurityManager: Changing view acls groups to:
-18/10/24 20:37:32 INFO SecurityManager: Changing modify acls groups to:
-18/10/24 20:37:32 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(scibr); groups with view permissions: Set(); users with modify permissions: Set(scibr); groups with modify permissions: Set()
-18/10/24 20:37:32 INFO Utils: Successfully started service 'sparkDriver' on port 50206.
-18/10/24 20:37:32 INFO SparkEnv: Registering MapOutputTracker
-18/10/24 20:37:32 INFO SparkEnv: Registering BlockManagerMaster
-18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
-18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
-18/10/24 20:37:32 INFO DiskBlockManager: Created local directory at C:\Users\scibr\AppData\Local\Temp\blockmgr-9b3cbfbd-5baa-4af8-b300-189da3dcae01
-18/10/24 20:37:32 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
-18/10/24 20:37:32 INFO SparkEnv: Registering OutputCommitCoordinator
-18/10/24 20:37:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
-18/10/24 20:37:32 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://127.0.0.1:4040
-18/10/24 20:37:32 INFO SparkContext: Added JAR file:/C:/Users/scibr/Documents/R/win-library/3.5/sparklyr/java/sparklyr-2.3-2.11.jar at spark://127.0.0.1:50206/jars/sparklyr-2.3-2.11.jar with timestamp 1540381052772
-18/10/24 20:37:32 INFO Executor: Starting executor ID driver on host localhost
-18/10/24 20:37:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50227.
-18/10/24 20:37:32 INFO NettyBlockTransferService: Server created on 127.0.0.1:50227
-18/10/24 20:37:32 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
-18/10/24 20:37:32 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 127.0.0.1, 50227, None)
-18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:50227 with 366.3 MB RAM, BlockManagerId(driver, 127.0.0.1, 50227, None)
-18/10/24 20:37:32 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 127.0.0.1, 50227, None)
-18/10/24 20:37:32 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 127.0.0.1, 50227, None)
-18/10/24 20:37:33 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
-18/10/24 20:37:36 INFO SparkContext: Invoking stop() from shutdown hook
-18/10/24 20:37:36 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
-18/10/24 20:37:36 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
-18/10/24 20:37:36 INFO MemoryStore: MemoryStore cleared
-18/10/24 20:37:36 INFO BlockManager: BlockManager stopped
-18/10/24 20:37:36 INFO BlockManagerMaster: BlockManagerMaster stopped
-18/10/24 20:37:36 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
-18/10/24 20:37:36 INFO SparkContext: Successfully stopped SparkContext
-18/10/24 20:37:36 INFO ShutdownHookManager: Shutdown hook called
-18/10/24 20:37:36 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-4f2683b7-49df-4b71-8096-73acfb00a16e
-18/10/24 20:37:36 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-48587667-83df-4a73-8fe8-94189b1125cf
+18/10/25 22:21:51 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
+18/10/25 22:21:58 INFO SparkContext: Running Spark version 2.3.2
+18/10/25 22:21:58 INFO SparkContext: Submitted application: sparklyr
+18/10/25 22:21:59 INFO SecurityManager: Changing view acls to: scibr
+18/10/25 22:21:59 INFO SecurityManager: Changing modify acls to: scibr
+18/10/25 22:21:59 INFO SecurityManager: Changing view acls groups to:
+18/10/25 22:21:59 INFO SecurityManager: Changing modify acls groups to:
+18/10/25 22:21:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(scibr); groups with view permissions: Set(); users with modify permissions: Set(scibr); groups with modify permissions: Set()
+18/10/25 22:22:00 INFO Utils: Successfully started service 'sparkDriver' on port 64894.
+18/10/25 22:22:00 INFO SparkEnv: Registering MapOutputTracker
+18/10/25 22:22:01 INFO SparkEnv: Registering BlockManagerMaster
+18/10/25 22:22:01 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
+18/10/25 22:22:01 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
+18/10/25 22:22:01 INFO DiskBlockManager: Created local directory at C:\Users\scibr\AppData\Local\Temp\blockmgr-fdf4f087-5eba-4dfb-b468-a2354946e522
+18/10/25 22:22:01 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
+18/10/25 22:22:01 INFO SparkEnv: Registering OutputCommitCoordinator
+18/10/25 22:22:01 INFO Utils: Successfully started service 'SparkUI' on port 4040.
+18/10/25 22:22:02 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://127.0.0.1:4040
+18/10/25 22:22:02 INFO SparkContext: Added JAR file:/C:/Users/scibr/Documents/R/win-library/3.5/sparklyr/java/sparklyr-2.3-2.11.jar at spark://127.0.0.1:64894/jars/sparklyr-2.3-2.11.jar with timestamp 1540473722859
+18/10/25 22:22:03 INFO Executor: Starting executor ID driver on host localhost
+18/10/25 22:22:03 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64925.
+18/10/25 22:22:03 INFO NettyBlockTransferService: Server created on 127.0.0.1:64925
+18/10/25 22:22:03 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
+18/10/25 22:22:03 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 127.0.0.1, 64925, None)
+18/10/25 22:22:03 INFO BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:64925 with 366.3 MB RAM, BlockManagerId(driver, 127.0.0.1, 64925, None)
+18/10/25 22:22:03 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 127.0.0.1, 64925, None)
+18/10/25 22:22:03 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 127.0.0.1, 64925, None)
+18/10/25 22:22:04 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
+18/10/25 22:26:03 INFO SparkContext: Invoking stop() from shutdown hook
+18/10/25 22:26:03 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
+18/10/25 22:26:04 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
+18/10/25 22:26:04 INFO MemoryStore: MemoryStore cleared
+18/10/25 22:26:04 INFO BlockManager: BlockManager stopped
+18/10/25 22:26:04 INFO BlockManagerMaster: BlockManagerMaster stopped
+18/10/25 22:26:04 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
+18/10/25 22:26:04 INFO SparkContext: Successfully stopped SparkContext
+18/10/25 22:26:04 INFO ShutdownHookManager: Shutdown hook called
+18/10/25 22:26:04 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-3d2476e8-964f-4158-aa31-45e86f0d4e8d
+18/10/25 22:26:04 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-ae3b3080-7176-4f04-b746-7f19be926ef4
