[TIL][Kubernetes] How to move your GKE into minikube

Preface

As a cloud company, we usually build our container clusters on cloud platforms such as GCP or AWS, but sometimes we need to think about how to bring a to-go solution to our customers. So here comes a challenge: scale in our cloud platform and put it into a pocket (mmm, I mean "minikube").

An example GKE service

Here is a simple example which you might build in GKE:

DB: MongoDB as a StatefulSet bound to a Google persistent disk, using the StatefulSet from "Running MongoDB on Kubernetes with StatefulSets" as an example.
Web service (fooBar): a Golang application which accesses MongoDB through a load balancer. Because fooBar is a proprietary application, its image is stored in GCR (Google Container Registry), not in Docker Hub.

How to migrate your service from GKE to minikube

We will list some notes and tips for migrating your service to...
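Since minikube has no GCP credentials, a common approach for a GCR-hosted image like fooBar is a registry pull secret referenced from the pod spec. A minimal sketch (the secret name `gcr-pull-secret` and the image path `gcr.io/my-project/foobar` are made-up placeholders):

```yaml
# Create the secret first (one-time, using a GCP service-account key):
#   kubectl create secret docker-registry gcr-pull-secret \
#     --docker-server=https://gcr.io \
#     --docker-username=_json_key \
#     --docker-password="$(cat key.json)"
apiVersion: apps/v1
kind: Deployment
metadata:
  name: foobar
spec:
  replicas: 1
  selector:
    matchLabels:
      app: foobar
  template:
    metadata:
      labels:
        app: foobar
    spec:
      imagePullSecrets:
        - name: gcr-pull-secret        # lets minikube pull from GCR
      containers:
        - name: foobar
          image: gcr.io/my-project/foobar:latest   # hypothetical image path
```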
Continue reading…

[TIL][SMACK] Install and run Kafka in Mac OSX

Why not Kafka 0.11 or 1.0?

The homebrew kafka formula uses 0.11 and could not launch on my computer, and it is hard to find out in detail why homebrew/kafka failed. (issue)
The transaction coordinator is still not supported by the sarama Golang client. (issue)

Install Kafka 0.8 manually (2017/11)

sudo su -
cd /tmp
wget https://archive.apache.org/dist/kafka/0.8.2.2/kafka_2.9.1-0.8.2.2.tgz
tar -zxvf kafka_2.9.1-0.8.2.2.tgz -C /usr/local/
cd /usr/local/kafka_2.9.1-0.8.2.2
sbt update
sbt package
cd /usr/local
ln -s kafka_2.9.1-0.8.2.2 kafka
echo "" >> ~/.bash_profile
echo "# KAFKA" >> ~/.bash_profile
echo "export KAFKA_HOME=/usr/local/kafka" >> ~/.bash_profile
echo "export KAFKA=\$KAFKA_HOME/bin" >> ~/.bash_profile
echo "export KAFKA_CONFIG=\$KAFKA_HOME/config" >> ~/.bash_profile
source ~/.bash_profile
$KAFKA/zookeeper-server-start.sh $KAFKA_CONFIG/zookeeper.properties
$KAFKA/kafka-server-start.sh $KAFKA_CONFIG/server.properties

How to verify your installation?

Create a topic:
$KAFKA/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
Verify it:
$KAFKA/kafka-topics.sh --list --zookeeper localhost:2181
(it should print test)

Or you can use the Golang client to verify it: console consumer, console producer. Troubleshooting...
Continue reading…

[TIL][Golang] Basic usage of cobra

spf13/cobra is a great package if you want to write your own console app. Even the Kubernetes CLI uses cobra to build its console app.

Create a simple CLI app example

Let's use kubectl as a simple example of the commands it supports:

kubectl get nodes
kubectl create -f ...RESOURCE
kubectl delete -f ...RESOURCE

Create sub-commands using Cobra

Taking those commands as an example, there are some sub-commands as follows: get, create, delete. Here is how we add those sub-commands to your app (e.g. kctl):

1. Run cobra init in your repo. It will create /cmd and main.go.
2. Run cobra add get to add the sub-command get.
3. Now you can try kctl get to get a prompt from the console confirming you have called this sub-command.
4. Repeat for create and delete. You will see the related help in kctl --help.

Add nested commands using Cobra

Cobra's console mode can add sub-commands, but you need to add nested commands manually. e.g. we...
Continue reading…

[TIL] Effective way for git rebase

Notes: This article explains git rebase along with some related git tips, including how to completely remove an accidentally committed file from git history.

Rebase manually

Before starting the rebase, fetch the latest code: git fetch origin
Start the rebase (e.g. rebase onto master): git rebase origin/master
Modify code, then: git add your_changed_code and git rebase --continue
Force push because the tree differs: git push -f -u origin HEAD
  -u: short for upstream
  -f: force (because you rewrote your commit history)

Interactive rebasing

Rebase interactively: git rebase -i origin/develop
It will open an editor where you select each change and mark it as pick or squash.
Check all commits: git log --stat
Reset: git reset HEAD~ (roll back the last commit), then git log --stat --decorate

Here is a detailed example of how to rebase evan/test1 onto develop:

git checkout -t origin/evan/test1
git log --stat --decorate
git fetch origin
git rebase -i origin/develop
vim .gitignore
git add -v .gitignore
git rebase --continue
git status
git submodule update
git log --stat...
Continue reading…

[Coursera] Deep Learning Specialization: Neural Networks and Deep Learning (Part 3)

I have finally completed the first course of deeplearning.ai, "Neural Networks and Deep Learning". It is a really fun foundational course; finishing it is basically equivalent to covering the whole O'Reilly deep learning book, and the homework has you implement part of a DNN classifier with numpy.

Origin: I had been meaning to study deep learning, and by chance I came across this Coursera review. I tried the seven-day trial, and besides providing Jupyter Notebooks, the assignments are quite fun, so I kept going. (My progress is currently at Week 2.) I highly recommend it for anyone with a little programming background; the math inside should be manageable. Along the way you also learn how to use numpy in Python, since the course mainly teaches you how to build a neural network with numpy.

Course link: here
Study notes:
Week 1-2: Introduction to deep learning & Neural Networks Basics
Week 3: Shallow neural networks
Week 4: Deep Neural Networks

Course content, Week 4: Deep Neural Networks

Basic notation:
The number of layers of a deep neural network does not include the input layer, but does include the hidden layers and the output layer.
\(n^{[l]}\) denotes the number of units in layer \(l\).
\(X\) (the input layer) can also be written as \(a^{[0]}\).

So the simple case can be written as:

\[Z^{[1]} = W^{[1]} X + b^{[1]} \\ a^{[1]} = g^{[1]}(Z^{[1]}) \\ Z^{[2]} = W^{[2]} a^{[1]} + b^{[2]} \\ a^{[2]} = g^{[2]}(Z^{[2]})\]

… and do not forget \(X = a^{[0]}\). This simplifies to the general rule:

\[Z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]} \\ a^{[l]} = g^{[l]}(Z^{[l]})\]

where \(l = 1, 2, \dots, L\).

Hyperparameters: anything used to determine \(w\) and \(b\) counts as a hyperparameter, for example:
Learning rate
Number of hidden layers and hidden units
Choice of activation...
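The forward-propagation recursion above can be sketched in numpy. This is my own minimal illustration, not the course's assignment code; the layer sizes and the ReLU/sigmoid choices for \(g^{[l]}\) are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    """Forward propagation for an L-layer network.

    params is a list of (W, b) pairs, one per layer; ReLU is used
    for hidden layers and sigmoid for the output layer.
    """
    a = X  # a^[0] is the input
    for l, (W, b) in enumerate(params, start=1):
        z = W @ a + b                # Z^[l] = W^[l] a^[l-1] + b^[l]
        if l < len(params):
            a = np.maximum(0, z)     # g^[l] = ReLU for hidden layers
        else:
            a = sigmoid(z)           # g^[L] = sigmoid for the output
    return a

# Tiny example: 3 inputs -> 4 hidden units -> 1 output, batch of 5 examples.
rng = np.random.default_rng(0)
params = [(rng.standard_normal((4, 3)) * 0.01, np.zeros((4, 1))),
          (rng.standard_normal((1, 4)) * 0.01, np.zeros((1, 1)))]
X = rng.standard_normal((3, 5))
y_hat = forward(X, params)
print(y_hat.shape)  # (1, 5): one prediction per example
```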
Continue reading…

[Coursera] Deep Learning Specialization: Neural Networks and Deep Learning (Part 2)

Week 3 dragged on for a while, because I was pulled away to focus on writing code. The week-3 material is actually very interesting; I learned back-propagation as well as some NN tricks:

How to set initial weights effectively
How to choose the activation function

The Jupyter Notebook assignments were also great fun… The final master interview is with Ian Goodfellow, the inventor of GANs.

Course content, Week 3: Shallow neural networks

How to express a multi-layer neural network:

Layer 1: \(z^{[1]} = W^{[1]} X + b^{[1]}\)
Layer 1 output: \(a^{[1]} = \sigma(z^{[1]})\)
Layer 2: \(z^{[2]} = W^{[2]} a^{[1]} + b^{[2]}\)
Layer 2 output: \(a^{[2]} = \sigma(z^{[2]})\)

The subscript denotes which neuron it is. For the different training examples \(1 \dots m\):

for i in range(1, m):
    # layer 1
    # layer 1 output
    # layer 2
    # layer 2 output

On choosing the activation function

I highly recommend the article "26种神经网络激活函数可视化" (visualizing 26 neural-network activation functions), which covers activation functions in depth. Some notes:

Sigmoid: very useful for binary-classification supervised learning (i.e. learning whether something is a certain kind of object, for example whether it is a cat), because the output is a value between 0 and 1, so you can set a threshold depending on how the training goes.
Tanh: outputs a value between -1 and 1, which can make learning faster, but it is not necessarily better for binary-classification supervised learning; even so, more and more people replace Sigmoid with Tanh.

On weight initialization

In an NN, once you have one or more hidden layers, you must initialize your starting weights (W1) carefully. If your starting weights are not initialized "randomly", the hidden layer becomes almost meaningless. That is because: W1 =...
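A tiny numpy sketch (my own illustration, not the course code) of why random initialization matters: with all-zero weights, every hidden unit computes the same activation, so they would also receive the same gradient and never learn different features.

```python
import numpy as np

def tanh_layer(W, X, b):
    """One hidden layer with a tanh activation: a = tanh(W X + b)."""
    return np.tanh(W @ X + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 10))      # 3 features, 10 training examples

# Zero initialization: all 4 hidden units compute exactly the same thing,
# so every row of the activation matrix is identical.
W_zero = np.zeros((4, 3))
a_zero = tanh_layer(W_zero, X, np.zeros((4, 1)))
print(np.allclose(a_zero, a_zero[0]))     # True: rows all identical

# Small random initialization breaks the symmetry.
W_rand = rng.standard_normal((4, 3)) * 0.01
a_rand = tanh_layer(W_rand, X, np.zeros((4, 1)))
print(np.allclose(a_rand, a_rand[0]))     # False: units now differ
```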
Continue reading…