Bridging Centrality Plugin #200
Hi, thanks for the pull request. You should actually base your branch on our master branch, and create a pull request into our master-forge branch.
Sorry, I'm a bit lost.
I'm not sure where I went wrong, so I don't know how to fix it. Can you
help me?
Anderson Rodrigues dos Santos - Professor Adjunto
Universidade Federal de Uberlândia – FACOM – sala 1B120
<http://lattes.cnpq.br/3752226356973936>
<http://www.researchgate.net/profile/Anderson_Santos3/>
Sorry, I was confused, I misread the pull request target being our master. Your request is correct.
Thank you, Eduardo.
By the way, this is one of the products of a master's thesis by a student
of mine. Can you recommend a journal that might accept this kind of
software-tool publication?
Hi,
I don't know much about publications, but you can check some existing Gephi citations: https://gephi.org/users/publications/
https://scholar.google.fr/scholar?cites=692893633449340225
Thank you. I think it is better to analyze some biological data to prove
the value of the plugin. I will do that.
Cheers.
Dear Eduardo,
I built this plugin to handle graphs with more than four thousand nodes and
many hundreds of thousands of edges, and it's working. Considering this
order of magnitude, I defined my variables as double, since the relevant
information will probably be at least ten digits to the right of the
decimal point; I have not a single variable defined as float.
Unfortunately, even calling the method as
nodeTable.addColumn(BRIDGING_CENTRALITY, "Bridging Centrality", Double.class, (double) 0);
I can only read a precision of at most six digits in the Data Laboratory
tab, and the majority of the data are listed as just zeros (0.000000),
with only a few lines showing something different from zero.
So, if I didn't do anything wrong, I believe this is a limitation of your
primary classes. Can you help me?
Regards,
Anderson Santos
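The display behavior described above can be reproduced with a minimal, self-contained Java sketch (no Gephi dependency; the values are illustrative): fixed six-decimal formatting renders tiny scores as zeros, even though the underlying double keeps its full precision.

```java
import java.util.Locale;

// Minimal sketch (no Gephi dependency; values are illustrative):
// six-decimal fixed-point formatting hides tiny centrality scores,
// while the double itself still holds the full value.
public class PrecisionDemo {
    public static void main(String[] args) {
        double tiny  = 3.2e-8;                  // far below the display resolution
        double small = 1.1034059555675141e-4;   // the value mentioned later in the thread

        // Six-decimal rendering, similar to what a fixed-precision table shows.
        System.out.println(String.format(Locale.ROOT, "%.6f", tiny));   // 0.000000
        System.out.println(String.format(Locale.ROOT, "%.6f", small));  // 0.000110

        // The stored double was never rounded; only the display is.
        System.out.println(small);  // 1.1034059555675141E-4
    }
}
```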
Mmm, I think the Data Laboratory limits precision; can you try with BigDecimal.class? Anyway, on double-click it should show the full value.
I did not know that tip (double-click). Indeed, the full precision is shown
with this action, so I don't think changing the number type will change the
Data Laboratory precision. The worst case happens when we export the data:
the number 0.00011 shows as 1.1034059555675141E-4 after the double-click,
but after exporting to CSV it is just 0.00011. The point here is that the
user must double-click a lot of cells to get a clue about the data
distribution, for instance, to select the relevant top 20 nodes that are
not exactly zero. Please, fix this.
I guess we should increase the shown precision to 5 or 6 decimals, and show full precision with BigDecimal, but that will need to wait for a next release. Anyway, I advise you to use BigDecimal if you want real precision, as floats and doubles have inherent precision loss; that's why we round them.
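The inherent precision loss Eduardo mentions can be illustrated with a small, self-contained Java sketch (hypothetical values, unrelated to the plugin's actual code):

```java
import java.math.BigDecimal;

// Sketch of the precision point above (illustrative values only).
public class BigDecimalDemo {
    public static void main(String[] args) {
        // The double literal 0.00011 is silently stored as the nearest
        // binary fraction, which is not exactly 0.00011.
        BigDecimal exact  = new BigDecimal("0.00011"); // exact decimal value
        BigDecimal stored = new BigDecimal(0.00011);   // exposes what the double really holds

        System.out.println(exact.toPlainString());        // 0.00011
        System.out.println(stored.compareTo(exact) == 0); // false

        // BigDecimal.valueOf goes through Double.toString, recovering the short form.
        System.out.println(BigDecimal.valueOf(0.00011).toPlainString()); // 0.00011
    }
}
```

Constructing a BigDecimal from a String (or via `valueOf`) keeps the exact decimal value, which is why it avoids the rounding seen with double columns.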
Thank you for your answer, Eduardo. I will make the change to BigDecimal
and will be looking forward to the next release.
BTW, if such a release happens before March 2019, I will be pleased,
because that is my planned date for a publication citing Gephi as
the tool used in the methods.
Cheers,
Anderson Santos
Eduardo,
I thought you were just guessing about the BigDecimal. Sorry.
Yes, it did the trick.
[image: Captura de tela de 2018-11-14 20-43-39.png]
The selected node is the last one. The data listed above is a small sample
with just one hundred nodes, and I already have a lot of zeros.
I will push this modification. However, the CSV export still has the
problem.
BTW, do I need to make a new pull request for you to be able to incorporate
the changes?
Erratum: the sent image does not show the last node, but believe me, it's
very small.
…g the problem of limited numerical precision
…us the column with the same name created by this plugin; plotting in the VERTICAL orientation; dividing the domain into ten tips according to the maximum value
Hi Anderson, I made some small corrections to your code, so please be sure to merge the following pull request into your codebase so you can later update the plugin: santosardr#1
Your plugin is now available at https://gephi.org/plugins and in the Gephi - Tools - Plugins menu
Thank you for the excellent news, Eduardo.
Best regards,
Anderson Santos
Fix pull request problems gephi#200
Bridging Centrality (BC) identifies nodes connected to immediate neighbors that, in turn, connect clusters of nodes. BC is the product of Betweenness Centrality and Bridging Node Centrality. BC can be useful for identifying links between groups of nodes. For instance, in a protein network, BC could point out proteins acting as a bridge between complementary molecular functions. This plugin adds new data columns to the Gephi Data Laboratory tab: Betweenness, Bridging Node, and Bridging Centrality.
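As a rough illustration of the product described above, here is a hedged, self-contained Java sketch. The bridging coefficient formula used (a node's inverse degree divided by the sum of its neighbors' inverse degrees) follows the widely cited definition by Hwang et al.; it is an assumption, since the plugin's actual source is not shown in this thread, and the betweenness values are supplied by hand rather than computed.

```java
import java.util.List;
import java.util.Map;

// Hedged sketch: bridging centrality as betweenness times a bridging
// coefficient (Hwang et al. definition; NOT taken from the plugin source).
// The graph is a plain adjacency map; betweenness is supplied by hand.
public class BridgingCentrality {

    // Bridging coefficient of v: (1/deg(v)) / sum over neighbors of 1/deg(i).
    static double bridgingCoefficient(Map<Integer, List<Integer>> adj, int v) {
        double inverseDegreeSum = 0.0;
        for (int neighbor : adj.get(v)) {
            inverseDegreeSum += 1.0 / adj.get(neighbor).size();
        }
        return (1.0 / adj.get(v).size()) / inverseDegreeSum;
    }

    // Bridging centrality: betweenness(v) * bridgingCoefficient(v).
    static double bridgingCentrality(Map<Integer, List<Integer>> adj,
                                     Map<Integer, Double> betweenness, int v) {
        return betweenness.get(v) * bridgingCoefficient(adj, v);
    }

    public static void main(String[] args) {
        // Path graph 0-1-2: node 1 bridges the two endpoints.
        Map<Integer, List<Integer>> adj = Map.of(
                0, List.of(1),
                1, List.of(0, 2),
                2, List.of(1));
        Map<Integer, Double> betweenness = Map.of(0, 0.0, 1, 1.0, 2, 0.0);
        System.out.println(bridgingCentrality(adj, betweenness, 1)); // 0.25
    }
}
```

On this three-node path, only the middle node has nonzero betweenness, so it is the only node with a nonzero bridging centrality.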