Deploy to mainnet #1305

Merged: 43 commits merged into master from testnet on Jul 13, 2023

Commits (43)
f8f356f
[Backend] Unify extra_info (#1267)
iamyates May 25, 2023
3d0d561
implement download csv (#1241)
iamyates May 26, 2023
80d9dc2
Merge pull request #1293 from nervosnetwork/master
Keith-CY May 26, 2023
c116e9e
Integrate websocket client into rails side (#1282)
ShiningRay May 29, 2023
4d82348
2023 04 21 sort by field new (#1295)
iamyates May 30, 2023
7853945
2023 05 04 sort nft collections by hot new (#1294)
iamyates May 31, 2023
22b9201
Issue 285 (#1296)
zmcNotafraid Jun 2, 2023
576e266
ci: update reviewers to deploy (#1299)
Keith-CY Jun 2, 2023
3ac0c32
feat: reduce ckb transactions export as csv n+1 querys (#1301)
rabbitz Jun 6, 2023
26dd2e1
fix memory leak in udt transaction count (#1302)
ShiningRay Jun 7, 2023
3b6835f
2023 05 31 support customers and developers or operator better (#1298)
iamyates Jun 7, 2023
ab05889
fix: update cota sync data (#1303)
rabbitz Jun 8, 2023
71ead97
fix: rectify the cout error in the progress bar (#1304)
rabbitz Jun 9, 2023
9a5dad8
Deploy to testnet (#1291)
github-actions[bot] Jun 9, 2023
dd0f77d
add tx_committed filter for transaction list
ShiningRay Jun 9, 2023
8b76cc7
Merge pull request #1306 from ShiningRay/fix/tx_committed_filter
ShiningRay Jun 9, 2023
e4dad7c
Optimize block query performance (#1307)
ShiningRay Jun 11, 2023
76198de
remove old deploy scripts (#1308)
ShiningRay Jun 12, 2023
7bc74ca
fix: generate block fatory timestamp (#1311)
rabbitz Jun 12, 2023
7be8d69
only check ckb node in blocksyncer, poolsyncer (#1310)
ShiningRay Jun 12, 2023
3444d6d
Extract output data into separate table (#1300)
ShiningRay Jun 12, 2023
d0d160d
fix: cell output cota info
rabbitz Jun 13, 2023
4f7b852
Merge pull request #1312 from rabbitz/hotfix/cell_output_cota_info
rabbitz Jun 13, 2023
2db7e9a
chore: resolve merge conflicts (#1314)
rabbitz Jun 16, 2023
e61586a
Fix/resolve merge conflicts (#1315)
rabbitz Jun 16, 2023
89cfe25
remove duplicated ckb transactions
ShiningRay Jun 18, 2023
7c8b7b7
Merge pull request #1316 from ShiningRay/fix/dup_pending_tx
Keith-CY Jun 18, 2023
e56a8e1
fix: block transactions sort order by id asc
rabbitz Jun 19, 2023
480824a
Merge pull request #1317 from rabbitz/fix/block_transactions_default_…
rabbitz Jun 19, 2023
cc4d375
feat: filter transactions by address hash (#1319)
rabbitz Jun 29, 2023
01c7b60
refactor: backend epoch statistics (#1313)
rabbitz Jun 29, 2023
09185f1
Optimize of cell data separation (#1318)
ShiningRay Jun 29, 2023
a994a6c
fix: resolve merge conflicts (#1323)
rabbitz Jul 3, 2023
fc50838
refactor: fix conflict with testnet
zmcNotafraid Jul 3, 2023
326eecf
Merge pull request #1324 from zmcNotafraid/fix-develop-conflict-0703
Keith-CY Jul 3, 2023
ca06d56
Merge branch 'testnet' into resolve-conflicts
Keith-CY Jul 3, 2023
6677b6c
Merge branch 'develop' into resolve-conflicts
Keith-CY Jul 3, 2023
0558b78
Update app/models/cell_output.rb
Keith-CY Jul 3, 2023
97bb617
Merge pull request #1325 from nervosnetwork/resolve-conflicts
Keith-CY Jul 3, 2023
a74fe6c
Merge pull request #1322 from nervosnetwork/develop
ShiningRay Jul 6, 2023
f3f3f40
Clean up outdated transactions (#1328)
ShiningRay Jul 10, 2023
aa1fc1c
refactor: daily statistic generator use attr logic concern (#1321)
zmcNotafraid Jul 10, 2023
85ced84
Merge pull request #1329 from nervosnetwork/develop
Keith-CY Jul 11, 2023
Files changed
4 changes: 2 additions & 2 deletions .env.example
@@ -5,7 +5,7 @@ CKB_NET_MODE="mainnet"

# ckb node url
CKB_NODE_URL="http://localhost:8114"

CKB_WS_URL="http://localhost:28114"

# -------------------------------- Rails(database, redis, memcached) segment ----
# (optional if you use config/database.yml)
@@ -112,4 +112,4 @@ ASSET_URL=""
# (optional)
# used in Rails test environment, setting to true enables SimpleCov::Formatter::Codecov
# true | false
CI="false"
CI="false"
2 changes: 1 addition & 1 deletion .github/workflows/request-to-deploy-mainnet.yml
@@ -16,6 +16,6 @@ jobs:
source_branch: 'testnet'
destination_branch: 'master'
pr_title: 'Deploy to mainnet'
pr_reviewer: 'ShiningRay,iamyates,zmcNotafraid,keith-cy'
pr_reviewer: 'ShiningRay,rabbitz,zmcNotafraid,keith-cy'
pr_label: 'auto-pr'
github_token: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/request-to-deploy-testnet.yml
@@ -16,6 +16,6 @@ jobs:
source_branch: 'develop'
destination_branch: 'testnet'
pr_title: 'Deploy to testnet'
pr_reviewer: 'ShiningRay,iamyates,zmcNotafraid,keith-cy'
pr_reviewer: 'ShiningRay,rabbitz,zmcNotafraid,keith-cy'
pr_label: 'auto-pr'
github_token: ${{ secrets.GITHUB_TOKEN }}
2 changes: 2 additions & 0 deletions Gemfile
@@ -129,3 +129,5 @@ gem "rack-cache"
gem "dalli"
gem "after_commit_everywhere"
gem "kredis"

gem "async-websocket", "~> 0.22.1", require: false
45 changes: 45 additions & 0 deletions Gemfile.lock
@@ -97,6 +97,27 @@ GEM
rake (>= 10.4, < 14.0)
ansi (1.5.0)
ast (2.4.2)
async (2.3.1)
console (~> 1.10)
io-event (~> 1.1)
timers (~> 4.1)
async-http (0.59.5)
async (>= 1.25)
async-io (>= 1.28)
async-pool (>= 0.2)
protocol-http (~> 0.23)
protocol-http1 (~> 0.14.0)
protocol-http2 (~> 0.14.0)
traces (>= 0.8.0)
async-io (1.34.1)
async
async-pool (0.3.12)
async (>= 1.25)
async-websocket (0.22.1)
async-http (~> 0.54)
async-io (~> 1.23)
protocol-rack (~> 0.1)
protocol-websocket (~> 0.9.1)
awesome_print (1.9.2)
backport (1.2.0)
benchmark (0.2.1)
@@ -119,6 +140,8 @@ GEM
deep_merge (~> 1.2, >= 1.2.1)
dry-validation (~> 1.0, >= 1.0.0)
connection_pool (2.4.0)
console (1.16.2)
fiber-local
crack (0.4.5)
rexml
crass (1.0.6)
@@ -192,6 +215,7 @@ GEM
ffi-compiler (1.0.1)
ffi (>= 1.0.0)
rake
fiber-local (1.0.0)
fugit (1.8.1)
et-orbi (~> 1, >= 1.2.7)
raabro (~> 1.4)
@@ -211,6 +235,7 @@ GEM
http-form_data (2.3.0)
i18n (1.12.0)
concurrent-ruby (~> 1.0)
io-event (1.1.6)
jaro_winkler (1.5.4)
jbuilder (2.11.5)
actionview (>= 5.0.0)
@@ -271,6 +296,9 @@ GEM
net-protocol
newrelic_rpm (8.12.0)
nio4r (2.5.8)
nokogiri (1.14.3)
mini_portile2 (~> 2.8.0)
racc (~> 1.4)
nokogiri (1.14.3-arm64-darwin)
racc (~> 1.4)
nokogiri (1.14.3-x86_64-linux)
@@ -283,6 +311,19 @@ GEM
ast (~> 2.4.1)
pg (1.4.5)
pkg-config (1.5.1)
protocol-hpack (1.4.2)
protocol-http (0.24.0)
protocol-http1 (0.14.6)
protocol-http (~> 0.22)
protocol-http2 (0.14.2)
protocol-hpack (~> 1.4)
protocol-http (~> 0.18)
protocol-rack (0.2.4)
protocol-http (~> 0.23)
rack (>= 1.0)
protocol-websocket (0.9.1)
protocol-http (~> 0.2)
protocol-http1 (~> 0.2)
pry (0.14.1)
coderay (~> 1.1)
method_source (~> 1.0)
@@ -432,6 +473,8 @@ GEM
thor (1.2.1)
tilt (2.1.0)
timeout (0.3.0)
timers (4.3.5)
traces (0.8.0)
tzinfo (2.0.5)
concurrent-ruby (~> 1.0)
unf (0.1.4)
@@ -453,12 +496,14 @@ GEM

PLATFORMS
arm64-darwin-21
ruby
x86_64-linux

DEPENDENCIES
activerecord-import
after_commit_everywhere
annotate
async-websocket (~> 0.22.1)
awesome_print
benchmark_methods
bigdecimal
1 change: 1 addition & 0 deletions Procfile
@@ -2,3 +2,4 @@ web: bundle exec puma -C config/puma.rb
worker: bundle exec sidekiq -C config/sidekiq.yml -e production
blocksyncer: bundle exec ruby lib/ckb_block_node_processor.rb
scheduler: bundle exec ruby lib/scheduler.rb
poolsyncer: bundle exec ruby lib/websocket.rb
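
Note for reviewers: the new async-websocket gem, the CKB_WS_URL variable in .env.example, and this poolsyncer process entry together wire a WebSocket-based pool syncer into the app. The actual client lives in lib/websocket.rb, which this diff does not display; the sketch below only illustrates how such a client is typically assembled with the gem's documented interface, and the JSON-RPC subscribe payload is an assumption based on CKB's public subscription RPC.

# Hypothetical sketch only -- the real implementation is lib/websocket.rb (not shown here).
require "async"
require "async/http/endpoint"
require "async/websocket/client"

ws_url = ENV.fetch("CKB_WS_URL", "http://localhost:28114") # added in .env.example

Async do
  endpoint = Async::HTTP::Endpoint.parse(ws_url)

  Async::WebSocket::Client.connect(endpoint) do |connection|
    # Assumed payload: CKB nodes expose a JSON-RPC `subscribe` method over WebSocket.
    connection.write({ id: 1, jsonrpc: "2.0", method: "subscribe", params: ["new_transaction"] })
    connection.flush

    while (message = connection.read)
      # Each notification would be handed off to the pool-sync logic here.
      puts message.inspect
    end
  end
end
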
66 changes: 59 additions & 7 deletions app/controllers/api/v1/address_transactions_controller.rb
@@ -1,24 +1,68 @@
require "csv"
module Api
module V1
class AddressTransactionsController < ApplicationController
before_action :validate_query_params
before_action :validate_pagination_params, :pagination_params
before_action :set_address_transactions, only: [:show, :download_csv]

def show
@address = Address.find_address!(params[:id])
raise Api::V1::Exceptions::AddressNotFoundError if @address.is_a?(NullAddress)
@tx_ids = AccountBook.
joins(:ckb_transaction).
where(address_id: @address.id)

params[:sort] ||= "ckb_transaction_id.desc"
order_by, asc_or_desc = params[:sort].split(".", 2)
order_by =
case order_by
when "time" then "ckb_transactions.block_timestamp"
else order_by
end

head :not_found and return unless order_by.in? %w[
ckb_transaction_id block_timestamp
ckb_transactions.block_timestamp
]

@tx_ids = @tx_ids.
order(order_by => asc_or_desc).
select("ckb_transaction_id").
page(@page).per(@page_size).fast_page

order_by = "id" if order_by == "ckb_transaction_id"
@ckb_transactions = CkbTransaction.tx_committed.where(id: @tx_ids.map(&:ckb_transaction_id)).
select(:id, :tx_hash, :block_id, :block_number, :block_timestamp, :is_cellbase, :updated_at, :capacity_involved).
order(order_by => asc_or_desc)

@tx_ids = AccountBook.where(address_id: @address.id).order("ckb_transaction_id" => :desc).select("ckb_transaction_id").page(@page).per(@page_size).fast_page
@ckb_transactions = CkbTransaction.tx_committed.where(id: @tx_ids.map(&:ckb_transaction_id)).select(:id, :tx_hash, :block_id, :block_number, :block_timestamp, :is_cellbase, :updated_at).order(id: :desc)
json =
Rails.cache.realize("#{@ckb_transactions.cache_key}/#{@address.query_address}", version: @ckb_transactions.cache_version) do
@options = FastJsonapi::PaginationMetaGenerator.new(request: request, records: @ckb_transactions, page: @page, page_size: @page_size, records_counter: @tx_ids).call
Rails.cache.realize("#{@ckb_transactions.cache_key}/#{@address.query_address}",
version: @ckb_transactions.cache_version) do
@options = FastJsonapi::PaginationMetaGenerator.new(request: request, records: @ckb_transactions,
page: @page, page_size: @page_size, records_counter: @tx_ids).call
json_result
end

render json: json
end

def download_csv
args = params.permit(:id, :start_date, :end_date, :start_number, :end_number, address_transaction: {}).
merge(address_id: @address.id)
data = ExportAddressTransactionsJob.perform_now(args.to_h)

file =
CSV.generate do |csv|
csv << [
"TXn hash", "Blockno", "UnixTimestamp", "Method", "CKB In", "CKB OUT", "TxnFee(CKB)",
"date(UTC)"
]
data.each { |row| csv << row }
end

send_data file, type: "text/csv; charset=utf-8; header=present",
disposition: "attachment;filename=ckb_transactions.csv"
end

private

def validate_query_params
@@ -38,7 +82,10 @@ def pagination_params
end

def json_result
ckb_transaction_serializer = CkbTransactionsSerializer.new(@ckb_transactions, @options.merge(params: { previews: true, address: @address }))
ckb_transaction_serializer = CkbTransactionsSerializer.new(@ckb_transactions,
@options.merge(params: {
previews: true,
address: @address }))

if QueryKeyUtils.valid_address?(params[:id])
if @address.address_hash == @address.query_address
@@ -50,6 +97,11 @@ def json_result
ckb_transaction_serializer.serialized_json
end
end

def set_address_transactions
@address = Address.find_address!(params[:id])
raise Api::V1::Exceptions::AddressNotFoundError if @address.is_a?(NullAddress)
end
end
end
end
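
Reviewer note: two behavior changes in this controller stand out. The show action now accepts a whitelisted sort parameter of the form field.direction (time is an alias for ckb_transactions.block_timestamp, and unknown fields return 404), and download_csv streams the rows produced by ExportAddressTransactionsJob. A standalone restatement of the sort handling, included only to make the accepted values explicit:

# Illustrative restatement of the sort-parameter rules in `show` above; not new behavior.
ALLOWED_ORDER_COLUMNS = %w[
  ckb_transaction_id block_timestamp ckb_transactions.block_timestamp
].freeze

def parse_sort(param)
  order_by, direction = (param || "ckb_transaction_id.desc").split(".", 2)
  order_by = "ckb_transactions.block_timestamp" if order_by == "time"
  return :not_found unless ALLOWED_ORDER_COLUMNS.include?(order_by) # controller responds 404

  [order_by, direction]
end

parse_sort("time.desc")              # => ["ckb_transactions.block_timestamp", "desc"]
parse_sort(nil)                      # => ["ckb_transaction_id", "desc"]
parse_sort("capacity_involved.desc") # => :not_found
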
51 changes: 32 additions & 19 deletions app/controllers/api/v1/block_transactions_controller.rb
@@ -1,27 +1,40 @@
module Api
module V1
class BlockTransactionsController < ApplicationController
before_action :validate_query_params
before_action :validate_pagination_params, :pagination_params
include Pagy::Backend
before_action :validate_query_params, :validate_pagination_params, :pagination_params

def show
block = Block.find_by!(block_hash: params[:id])
@pagy, ckb_transactions = pagy(
block.ckb_transactions
.select(:id, :tx_hash, :block_id, :block_number, :block_timestamp, :is_cellbase, :updated_at)
.where(block_timestamp: block.timestamp)
.order(:id),
items: params[:page_size] || 10,
overflow: :empty_page)

json =
Rails.cache.realize(ckb_transactions.cache_key, version: ckb_transactions.cache_version) do
records_counter = RecordCounters::BlockTransactions.new(block)
options = FastJsonapi::PaginationMetaGenerator.new(request: request, records: ckb_transactions, page: @pagy.page, page_size: @pagy.items, records_counter: records_counter).call
CkbTransactionsSerializer.new(ckb_transactions, options.merge(params: { previews: true })).serialized_json
end

render json: json
ckb_transactions = block.ckb_transactions.
select(:id, :tx_hash, :block_id, :block_number, :block_timestamp, :is_cellbase, :updated_at).
order(:id)

if params[:tx_hash].present?
ckb_transactions = ckb_transactions.where(tx_hash: params[:tx_hash])
end

if params[:address_hash].present?
address = Address.find_address!(params[:address_hash])
ckb_transactions = ckb_transactions.joins(:account_books).
where(account_books: { address_id: address.id })
end

if stale?(ckb_transactions)
expires_in 10.seconds, public: true, must_revalidate: true, stale_while_revalidate: 5.seconds
records_counter = RecordCounters::BlockTransactions.new(ckb_transactions)
ckb_transactions = ckb_transactions.page(@page).per(@page_size).fast_page
options = FastJsonapi::PaginationMetaGenerator.new(
request: request,
records: ckb_transactions,
page: @page,
page_size: @page_size,
records_counter: records_counter
).call
json = CkbTransactionsSerializer.new(ckb_transactions,
options.merge(params: { previews: true })).serialized_json

render json: json
end
rescue ActiveRecord::RecordNotFound
raise Api::V1::Exceptions::BlockTransactionsNotFoundError
end
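
Reviewer note: the rewritten show action adds optional tx_hash and address_hash filters and switches to conditional HTTP caching (stale? sets an ETag for the filtered relation, expires_in emits a short public Cache-Control header). A hedged client-side sketch follows; the host, route, and content-type headers are assumptions about the deployed API, not part of this diff:

# Hypothetical client for the filtered endpoint; host, route, and headers are assumptions.
require "net/http"
require "uri"

block_hash   = "0x..."  # replace with a real block hash
address_hash = "ckb..." # replace with an address appearing in that block

uri = URI("https://mainnet-api.explorer.nervos.org/api/v1/block_transactions/#{block_hash}")
uri.query = URI.encode_www_form(address_hash: address_hash, page: 1, page_size: 10)

request = Net::HTTP::Get.new(uri)
request["Accept"] = "application/vnd.api+json"
request["Content-Type"] = "application/vnd.api+json"

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }

# The controller's stale? call yields an ETag; replaying the request with If-None-Match
# should return 304 Not Modified while the 10-second cache window is still fresh.
puts response.code
puts response["ETag"]
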
61 changes: 58 additions & 3 deletions app/controllers/api/v1/blocks_controller.rb
@@ -1,3 +1,4 @@
require "csv"
module Api
module V1
class BlocksController < ApplicationController
@@ -6,17 +7,37 @@ class BlocksController < ApplicationController

def index
if from_home_page?
blocks = Block.recent.select(:id, :miner_hash, :number, :timestamp, :reward, :ckb_transactions_count, :live_cell_changes, :updated_at).limit(ENV["HOMEPAGE_BLOCK_RECORDS_COUNT"].to_i)
blocks = Block.recent.select(:id, :miner_hash, :number, :timestamp, :reward, :ckb_transactions_count,
:live_cell_changes, :updated_at).limit(ENV["HOMEPAGE_BLOCK_RECORDS_COUNT"].to_i)
json =
Rails.cache.realize(blocks.cache_key, version: blocks.cache_version, race_condition_ttl: 3.seconds) do
BlockListSerializer.new(blocks).serialized_json
end
else
blocks = Block.recent.select(:id, :miner_hash, :number, :timestamp, :reward, :ckb_transactions_count, :live_cell_changes, :updated_at).page(@page).per(@page_size).fast_page
blocks = Block.select(:id, :miner_hash, :number, :timestamp, :reward, :ckb_transactions_count,
:live_cell_changes, :updated_at)
params[:sort] ||= "number.desc"

order_by, asc_or_desc = params[:sort].split(".", 2)
order_by =
case order_by
when "height"
"number"
when "transactions"
"ckb_transactions_count"
else
order_by
end

head :not_found and return unless order_by.in? %w[number reward timestamp ckb_transactions_count]

blocks = blocks.order(order_by => asc_or_desc).page(@page).per(@page_size)

json =
Rails.cache.realize(blocks.cache_key, version: blocks.cache_version, race_condition_ttl: 3.seconds) do
records_counter = RecordCounters::Blocks.new
options = FastJsonapi::PaginationMetaGenerator.new(request: request, records: blocks, page: @page, page_size: @page_size, records_counter: records_counter).call
options = FastJsonapi::PaginationMetaGenerator.new(request: request, records: blocks, page: @page,
page_size: @page_size, records_counter: records_counter).call
BlockListSerializer.new(blocks, options).serialized_json
end
end
@@ -30,6 +51,40 @@ def show
render json: json_block
end

def download_csv
blocks = Block.select(:id, :miner_hash, :number, :timestamp, :reward, :ckb_transactions_count,
:live_cell_changes, :updated_at)

if params[:start_date].present?
blocks = blocks.where("timestamp >= ?",
DateTime.strptime(params[:start_date],
"%Y-%m-%d").to_time.to_i * 1000)
end
if params[:end_date].present?
blocks = blocks.where("timestamp <= ?",
DateTime.strptime(params[:end_date],
"%Y-%m-%d").to_time.to_i * 1000)
end
blocks = blocks.where("number >= ?", params[:start_number]) if params[:start_number].present?
blocks = blocks.where("number <= ?", params[:end_number]) if params[:end_number].present?

blocks = blocks.order("number desc").limit(5000)

file =
CSV.generate do |csv|
csv << ["Blockno", "Transactions", "UnixTimestamp", "Reward(CKB)", "Miner", "date(UTC)"]
blocks.find_each.with_index do |block, _index|
row = [
block.number, block.ckb_transactions_count, (block.timestamp / 1000), block.reward, block.miner_hash,
Time.at((block.timestamp / 1000).to_i).in_time_zone("UTC").strftime("%Y-%m-%d %H:%M:%S")
]
csv << row
end
end
send_data file, type: "text/csv; charset=utf-8; header=present",
disposition: "attachment;filename=blocks.csv"
end

private

def from_home_page?
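
Reviewer note: index now maps the friendly sort keys height and transactions onto the number and ckb_transactions_count columns, and download_csv treats the timestamp column as epoch milliseconds, converting start_date/end_date to millisecond timestamps and capping the export at 5,000 rows. A plain-Ruby worked example of the timestamp conversion used above:

require "date"

# start_date/end_date arrive as "YYYY-MM-DD" strings and are compared against the
# millisecond-precision timestamp column, so they are converted to epoch milliseconds.
ms = DateTime.strptime("2023-07-13", "%Y-%m-%d").to_time.to_i * 1000
# => 1689206400000 (2023-07-13 00:00:00 UTC)

# The CSV's date(UTC) column reverses the conversion for each exported block.
Time.at(ms / 1000).utc.strftime("%Y-%m-%d %H:%M:%S")
# => "2023-07-13 00:00:00"
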