Basic Info
Link
Cognitive Graph for Multi-Hop Reading Comprehension at Scale
DeepInf: Social Influence Prediction with Deep Learning
(a): Raw input consisting of a mini-batch of B instances; each instance is a sub-network comprising n users sampled from the whole network. (b): An embedding layer that maps each user to a D-dimensional representation. (c): An instance normalization layer that normalizes each user's embedding. (d): The formal input layer, which concatenates the network embedding with the other features. (e): A GCN/GAT layer. (f) and (g): Comparing the model output with the ground truth yields the negative log-likelihood loss. In this example, the ego user v was ultimately activated (marked in black).
Given input dimension $d$ and input data $\vec{x} = (x_1, \ldots, x_d)$, $\vec{y} = (y_1, \ldots, y_d)$, an SVM kernel function $k(\vec{x}, \vec{y})$ can be expressed as the dot product of the transformations of $\vec{x}$ and $\vec{y}$ by a transformation function $\gamma$, that is,

$$k(\vec{x}, \vec{y}) = \gamma(\vec{x}) \cdot \gamma(\vec{y})$$

Derive the transformation function $\gamma$ for the following SVM kernels, and also compute the VC dimension of the SVM model based on these kernels.
By the multinomial expansion formula,

$$k(\vec{x}, \vec{y}) = (\vec{x} \cdot \vec{y})^n = (x_1 y_1 + x_2 y_2 + \cdots + x_d y_d)^n = \sum_{n_1 + \cdots + n_d = n} \frac{n!}{n_1! \, n_2! \cdots n_d!} (x_1 y_1)^{n_1} (x_2 y_2)^{n_2} \cdots (x_d y_d)^{n_d}$$

Therefore,
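The expansion above can be checked numerically for a small case. The sketch below assumes $d = 2$ and $n = 2$, where the multinomial expansion gives the explicit feature map $\gamma(\vec{x}) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$; it verifies that the polynomial kernel equals the dot product of the mapped vectors:

```python
import math

def poly_kernel(x, y, n=2):
    # k(x, y) = (x . y)^n
    return sum(a * b for a, b in zip(x, y)) ** n

def gamma(x):
    # Explicit feature map for d = 2, n = 2:
    # gamma(x) = (x1^2, sqrt(2) * x1 * x2, x2^2)
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

x, y = (1.0, 2.0), (3.0, -1.0)
lhs = poly_kernel(x, y)                                  # (1*3 + 2*(-1))^2 = 1
rhs = sum(a * b for a, b in zip(gamma(x), gamma(y)))     # gamma(x) . gamma(y)
```

The two quantities agree, confirming that the kernel computes the dot product in the lifted feature space without ever forming $\gamma$ explicitly.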
$$a_n = \sum_{i=1}^{n} \frac{1}{i} = 1 + \frac{1}{2} + \cdots + \frac{1}{n} = 1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \cdots > 1 + \frac{1}{2} + \left(\frac{1}{4} + \frac{1}{4}\right) + \left(\frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8}\right) + \cdots = 1 + \frac{1}{2} + \frac{1}{2} + \cdots = 1 + \frac{m}{2}$$

where $m$ is the number of $\frac{1}{2}$ terms, which grows as $n$ grows. Therefore,

$$\lim_{n \to \infty} a_n = \infty$$
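The grouping bound $a_{2^m} \ge 1 + \frac{m}{2}$ is easy to check numerically; a minimal sketch:

```python
def harmonic(n):
    # a_n = 1 + 1/2 + ... + 1/n
    return sum(1.0 / i for i in range(1, n + 1))

# The grouping argument predicts a_{2^m} >= 1 + m/2 for every m:
bounds_hold = all(harmonic(2 ** m) >= 1 + m / 2 for m in range(1, 11))
```

The bound holds for every tested $m$, and the partial sums keep growing without limit, exactly as the proof shows.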
As a counterexample, the sequence $\left\{1 + \frac{1}{2} + \cdots + \frac{1}{n}\right\}$ satisfies $|a_{n+p} - a_n| \le \frac{p}{n}$ for all $n, p \in \mathbb{N}^*$, yet it diverges.

For all $n, p \in \mathbb{N}^*$,

$$|a_{n+p} - a_n| \le |a_{n+p} - a_{n+p-1}| + |a_{n+p-1} - a_{n+p-2}| + \cdots + |a_{n+1} - a_n| \le \frac{1}{(n+p-1)^2} + \cdots + \frac{1}{n^2} \le \frac{1}{(n+p-1)(n+p-2)} + \cdots + \frac{1}{n(n-1)} = \frac{1}{n-1} - \frac{1}{n+p-1} < \frac{1}{n-1}$$
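Both claims can be checked numerically. The sketch below (with arbitrary illustrative choices $n = 10$, $p = 1000$) contrasts the harmonic partial sums, which obey the $\frac{p}{n}$ bound yet diverge, with partial sums whose increments are at most $\frac{1}{n^2}$, which stay within the telescoping bound $\frac{1}{n-1}$:

```python
def harmonic(n):
    # Partial sums of 1/k: satisfy |a_{n+p} - a_n| <= p/n but diverge.
    return sum(1.0 / k for k in range(1, n + 1))

def sq_sum(n):
    # Partial sums of 1/k^2: increments |a_{n+1} - a_n| <= 1/n^2.
    return sum(1.0 / (k * k) for k in range(1, n + 1))

n, p = 10, 1000
gap_harmonic = harmonic(n + p) - harmonic(n)  # <= p/n, yet grows without bound as p grows
gap_sq = sq_sum(n + p) - sq_sum(n)            # < 1/(n-1) by the telescoping estimate
```

This illustrates why the condition $|a_{n+p} - a_n| \le \frac{p}{n}$ is too weak to imply convergence, while the $\frac{1}{n^2}$ increment bound does yield a Cauchy sequence.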
We take the 0/1 classification problem for data with $d$-dimensional features as an example. A neural network with one hidden layer can be written as

$$o = w_2^T \sigma(W_1 v + b_1) + b_2$$

where $v$ is the $d$-dimensional input feature of the datum, and $W_1$, $w_2$, $b_1$, $b_2$ are the parameters of the neural network model. $W_1$ is an $n \times d$ matrix, $w_2$ is an $n$-dimensional vector, $b_1$ is an $n$-dimensional bias vector, and $b_2$ is a scalar bias. When $o > 0$ we classify the datum as label $1$; when $o \le 0$ we classify it as label $-1$. This forms a neural network, or multi-layer perceptron, with one hidden layer containing $n$ neurons. In this problem, we focus on the pre-training case with frozen parameters: $W_1$ and $b_1$ must be decided after seeing all $v$ but without the labels $(l_1, \ldots, l_i)$, while $w_2$ and $b_2$ can be decided after seeing all labels $(l_1, \ldots, l_i)$.
Given $n, d$, calculate the VC dimension of the neural network for the linear activation case, i.e. $\sigma(x) = x$. Prove your result.
Given $n, d$, calculate the VC dimension of the neural network for the ReLU activation case, i.e. $\sigma(x) = \max(0, x)$. Prove your result.
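For the linear activation case, the key observation is that the whole network collapses to a single affine function of $v$, so it is no more expressive than a linear classifier. A minimal numeric sketch (with arbitrary random parameters, in pure Python) illustrates this collapse:

```python
import random

random.seed(0)
n, d = 5, 3
W1 = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]  # n x d
w2 = [random.gauss(0, 1) for _ in range(n)]
b1 = [random.gauss(0, 1) for _ in range(n)]
b2 = random.gauss(0, 1)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def net_linear(v):
    # o = w2^T (W1 v + b1) + b2 with sigma(x) = x
    hidden = [dot(row, v) + b for row, b in zip(W1, b1)]
    return dot(w2, hidden) + b2

# With sigma(x) = x the network equals one affine map w_eff^T v + b_eff:
w_eff = [sum(w2[j] * W1[j][i] for j in range(n)) for i in range(d)]  # W1^T w2
b_eff = dot(w2, b1) + b2

v = [1.0, -2.0, 0.5]
```

Since every choice of parameters realizes some affine map (and vice versa), the linear-activation network shatters exactly the same point sets as a linear classifier on $\mathbb{R}^d$.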
Link: ImageNet Classification with Deep Convolutional Neural Networks - NIPS
Author:
The authors' research goal.
Problem statement: what problem is to be solved?
This guarantees that a header file is included only once, no matter how many times the directive appears in a file.
NSLog is similar to C's printf(), with added features such as a timestamp, a date stamp, and an automatically appended newline.
The NS prefix marks functions that come from Cocoa; the @ symbol indicates that the quoted string should be handled as a Cocoa NSString object.
NSArray provides arrays, NSDateFormatter helps format dates in various ways, NSThread provides tools for multithreaded programming, and NSSpeechSynthesizer lets you produce speech.
When NSLog() outputs the value of any object, the %@ format specifier is used; with this specifier, the object supplies its own NSLog() representation through a method named description.
Compared with C's bool type, Objective-C provides the BOOL type with the values YES (1) and NO (0); the compiler treats BOOL as an 8-bit integer.
Link: arXiv:1311.2524
Author: Ross Girshick, Jeff Donahue, Trevor Darrell, Jitendra Malik
The authors' research goal.
Problem statement: what problem is to be solved?
What is the method/algorithm used to solve the problem?
Given an example $x = (x_1; x_2; \ldots; x_d)$ described by $d$ attributes, where $x_i$ is the value of $x$ on the $i$-th attribute, a linear model tries to learn a function that makes predictions through a linear combination of the attributes, i.e.

$$f(x) = w_1 x_1 + w_2 x_2 + \cdots + w_d x_d + b$$

which is generally written in vector form as

$$f(x) = w^T x + b$$

where $w = (w_1; w_2; \ldots; w_d)$. Once $w$ and $b$ are learned, the model is determined.

Given a dataset $D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_m, y_m)\}$, where $x_i = (x_{i1}, x_{i2}, \ldots, x_{id})$ and $y_i \in \mathbb{R}$, "linear regression" tries to learn a linear model that predicts the real-valued output labels as accurately as possible.

We first consider the simplest case: the input has only one attribute. For ease of discussion we drop the attribute subscript, so $D = \{(x_i, y_i)\}_{i=1}^{m}$ with $x_i \in \mathbb{R}$. For a discrete attribute, if an "order" exists among its values, it can be converted into continuous values; if no order exists and the attribute has $k$ values, it is usually converted into a $k$-dimensional vector.

Linear regression tries to learn
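For the single-attribute case, the least-squares fit has a closed form, which can be sketched as follows (the data values here are small hypothetical numbers, roughly $y = 2x + 1$ plus noise):

```python
# Hypothetical 1-D data, roughly y = 2x + 1 with noise
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]

m = len(xs)
x_bar = sum(xs) / m
y_bar = sum(ys) / m

# Closed-form least squares for f(x) = w x + b:
#   w = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
#   b = y_bar - w * x_bar
w = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
b = y_bar - w * x_bar
```

The fitted $w$ and $b$ land close to the generating slope and intercept, which is exactly the "as accurately as possible" criterion made concrete as minimizing squared error.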