Tuesday, November 14, 2017

Simple Bayesian Neural Network with Edward


Edward enables us to convert TensorFlow code into Bayesian code. I'm not yet used to Edward, so as practice I'm converting some TensorFlow code into Edward code. In this article, I converted a simple neural network model into a Bayesian neural network.

The purpose of this article is to convert the TensorFlow code I posted before into Bayesian code with Edward.

Bayesian neural network model

With Edward, we can relatively easily convert a TensorFlow model into a probabilistic one.
The regression model for the iris data is from the article below.

Simple regression model by TensorFlow

A neural network is composed of input, hidden, and output layers, and the number of hidden layers is up to us. So the simplest network architecture has just one hidden layer. In this article, I'll make the simplest neural network for regression with TensorFlow.

In a nutshell, the model predicts one target value from three features. For the details, please check the article.
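As a framework-free illustration of that architecture (one hidden layer, three features in, one value out), here is a minimal NumPy sketch. It is not the article's TensorFlow code, and the data and names here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 3 features -> 1 target.
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# One hidden layer with ReLU, then a linear output.
n_hidden = 8
W1 = rng.normal(scale=0.5, size=(3, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 0.01
for _ in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)      # hidden activations
    pred = (h @ W2 + b2).ravel()          # predictions
    err = pred - y
    # Backpropagation of the mean-squared-error loss, by hand.
    grad_pred = (2.0 / len(y)) * err[:, None]
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T
    grad_h[h <= 0] = 0.0                  # ReLU gradient mask
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

# Final training error.
h = np.maximum(X @ W1 + b1, 0.0)
pred = (h @ W2 + b2).ravel()
mse = np.mean((pred - y) ** 2)
```

TensorFlow does the same thing with automatic differentiation, so only the forward pass needs to be written.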

Friday, November 10, 2017

Simple regression by Edward: variational inference


Edward is a PPL (probabilistic programming language). It lets us use variational inference, Gibbs sampling, and Monte Carlo methods relatively easily. But it doesn't look so easy at first, so I'll try it step by step.

In this article, the simple regression tried in the article Simple Bayesian modeling by Stan makes a nice example. So I did the same thing with Edward, using variational inference.
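As a sanity check for what variational inference is approximating: for plain linear regression with a Gaussian prior and known noise, the posterior is available in closed form. Here is a minimal NumPy sketch of that exact posterior (this is not Edward code; the data and names are made up):

```python
import numpy as np

rng = np.random.default_rng(1)

# y = X w + noise, with a Gaussian prior on w and known noise scale.
n, d = 200, 2
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0])
sigma = 0.5                        # known observation noise std
y = X @ w_true + sigma * rng.normal(size=n)

tau = 1.0                          # prior: w ~ N(0, tau^2 I)

# The posterior over w is Gaussian with this covariance and mean.
precision = X.T @ X / sigma**2 + np.eye(d) / tau**2
cov_post = np.linalg.inv(precision)
mean_post = cov_post @ X.T @ y / sigma**2
```

Variational inference earns its keep when, unlike here, no closed form exists; a result like this is useful for checking that the approximate posterior lands in the right place.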

Monday, October 23, 2017

How to complement missing values in data on Python

In data pre-processing, we frequently need to deal with missing values. There are several ways to handle them, and one is to fill them in with representative values such as the mean.

In Python, we can do this with scikit-learn.
I'll use the air quality data to try it.

To prepare the data, execute the following code in an R console, in your working directory.

write.csv(airquality, "airquality.csv", row.names=FALSE)
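On the Python side, the filling itself is a few lines. Note that the API has moved since 2017: the class was `sklearn.preprocessing.Imputer` then and is `sklearn.impute.SimpleImputer` in current scikit-learn. A small sketch with invented data standing in for the airquality columns:

```python
import numpy as np
from sklearn.impute import SimpleImputer

# Stand-in for two airquality columns; NaN marks a missing value.
X = np.array([[41.0, 190.0],
              [np.nan, 118.0],
              [12.0, np.nan],
              [18.0, 313.0]])

# Replace each missing entry with its column's mean.
imputer = SimpleImputer(strategy="mean")
X_filled = imputer.fit_transform(X)
```

`strategy` can also be `"median"` or `"most_frequent"` when the mean is a poor representative value.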

Wednesday, October 18, 2017

Image generator of Keras: to make neural network with little data

Keras has an image generator which works well when we don't have a large amount of data. I'll try it with a simple example.


To make a good neural network model for images, we need a large amount of data. In many cases, the shortage of data is one of the big obstacles to a good model.
Keras has an image generator, and it can ease that problem.
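The idea behind the generator is to create many randomly perturbed copies of each training image. This minimal NumPy sketch illustrates that idea only; it is not Keras's `ImageDataGenerator`, and all names in it are mine:

```python
import numpy as np

rng = np.random.default_rng(2)

def augment(image, rng):
    """Return a randomly perturbed copy of one image: the idea behind
    Keras's image generator, sketched with plain NumPy."""
    out = image
    if rng.random() < 0.5:           # random horizontal flip
        out = out[:, ::-1]
    shift = rng.integers(-2, 3)      # small random horizontal shift
    out = np.roll(out, shift, axis=1)
    return out

image = rng.random((8, 8))           # a fake 8x8 grayscale image
batch = np.stack([augment(image, rng) for _ in range(16)])
```

Keras adds rotations, zooms, rescaling, and streams the augmented batches straight into `fit`, so the training set is effectively much larger than the files on disk.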

Monday, October 16, 2017

InceptionV3 Fine-tuning model: the architecture and how to make it


InceptionV3 is one of the models for classifying images. We can easily use it from TensorFlow or Keras.
In this article, I'll check its architecture and try to make a fine-tuned model.

There are several image classification models we can use for fine-tuning.
Those models' weights are already trained, and with a few small steps you can adapt them to your own data.

About fine-tuning itself, please check the following.

TensorFlow and Keras also have nice documentation on fine-tuning.

From TensorFlow
From Keras

Sunday, October 15, 2017

How to interpret the summary of linear regression with log-transformed variable

How should we interpret the coefficients of linear regression when we use log-transformation?

In econometrics and data science, we sometimes use log-transformed variables in linear regression. Usually, one of the advantages of linear regression is that we can easily interpret the outcome. But with log-transformation, how should we interpret it?


In many cases, we adopt linear regression to analyze data. It lets us understand how influential each feature is.

So when we use it, to keep the interpretation easy, we want features as simple as possible. If you transform the features, you need to adjust your interpretation accordingly.
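For the log-level case: if log(y) = a + b·x, a one-unit increase in x multiplies y by exp(b), which is approximately a 100·b percent change when b is small. A NumPy sketch on synthetic data (all numbers here are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Generate data where y grows about 5% per unit of x.
a_true, b_true = 1.0, 0.05
x = rng.uniform(0, 10, size=500)
y = np.exp(a_true + b_true * x + 0.01 * rng.normal(size=500))

# Ordinary least squares on log(y).
b_hat, a_hat = np.polyfit(x, np.log(y), 1)

# exp(b_hat) is the multiplicative effect of a one-unit increase in x.
pct_change = (np.exp(b_hat) - 1) * 100
```

Here `pct_change` recovers roughly the 5 percent growth that was built into the data; the log-log case works the same way, except the coefficient is then read directly as an elasticity.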

Friday, October 13, 2017

I got started with JupyterLab

I just got started with JupyterLab.

From the official page, JupyterLab is:
"An extensible environment for interactive and reproducible computing, based on the Jupyter Notebook and Architecture."

As you know, Jupyter is a very useful tool. Using it efficiently directly improves your work efficiency. JupyterLab showed that possibility, at least to me.

Thursday, October 12, 2017

Hierarchical Bayesian model's parameter interpretation on Stan

Usually, a hierarchical Bayesian model has many parameters, so at first glance the interpretation of the sampled points' statistical summaries looks complex.

In the article below, I made a hierarchical Bayesian model for artificial data. Here, using almost the same but simpler data, I'll make a model and try to interpret it.

Hierarchical Bayesian model by Stan: Struggling

I'll try to make a hierarchical Bayesian model for artificial data with Stan. A hierarchical Bayesian model lets us write the model with a high degree of freedom.

Wednesday, October 11, 2017

Hierarchical Bayesian model by Stan: Struggling

I'll try to make a hierarchical Bayesian model for artificial data with Stan. A hierarchical Bayesian model lets us write the model with a high degree of freedom.

From Wikipedia,
Bayesian hierarchical modelling is a statistical model written in multiple levels (hierarchical form) that estimates the parameters of the posterior distribution using the Bayesian method.[1] The sub-models combine to form the hierarchical model, and the Bayes’ theorem is used to integrate them with the observed data, and account for all the uncertainty that is present. The result of this integration is the posterior distribution, also known as the updated probability estimate, as additional evidence on the prior distribution is acquired.
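The "artificial data" for such a model can be generated by drawing each group's parameter from a shared distribution, which is exactly the multi-level structure the quote describes. A minimal NumPy sketch (all values are invented; this is not the article's Stan model):

```python
import numpy as np

rng = np.random.default_rng(4)

# Each group's mean is itself drawn from a shared "hyper" distribution.
mu, tau = 5.0, 2.0                 # hyperparameters of the upper level
n_groups, n_per = 20, 50
group_means = rng.normal(mu, tau, size=n_groups)

# Observations within each group scatter around that group's mean.
y = group_means[:, None] + rng.normal(0.0, 1.0, size=(n_groups, n_per))

# A crude pooled estimate of the hyper-mean from all observations.
mu_hat = y.mean()
```

Fitting the model in Stan then runs this generative story in reverse: from `y` alone, it recovers posteriors for the group means and for the hyperparameters `mu` and `tau` jointly.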