On 11 November 2015 at 16:38, Peter Robinson <pbrobinson@xxxxxxxxx> wrote:
>>>> What is the best way to proceed from here? Would anybody like to
>>>> have an IRC meeting to discuss it perhaps, or to have it on the
>>>> agenda at one of the weekly infrastructure meetings? Is there
>>>> anybody from the infrastructure team who would like to log in to
>>>> the lab server where it runs now to get a feel for what is
>>>> running there?
>>>
>>> I think it would be great to discuss at a weekly meeting.
>>>
>>> Are you able to make those?
>>
>> I was traveling last week; for the next couple of weeks I should be
>> able to join, but not tomorrow. I am on UTC+1 (Central Europe), so
>> attending earlier in the meeting is easier for me.
>>
>> Would you like to make this an agenda item for 19 November?
>>
>> We also went live with debian.org XMPP on Saturday, using Prosody. So
>> far it has been successful, so we could also look at how to replicate
>> that on fedoraproject.org. It looks like Prosody is already in EPEL7:
>> https://dl.fedoraproject.org/pub/epel/7/x86_64/repoview/prosody.html
>> I've CC'd Robert (the package maintainer) and Matthew (Prosody project
>> leader).
>>
>> The TURN server (part of the existing fedrtc.org trial) can be shared
>> by XMPP users as well as serving SIP and WebRTC.
>
> I don't want to interject here, but I've got some queries (well,
> actually one query, which leads on to a bunch of questions).
>
> What's the current usage of fedrtc.org?
>
> I remember when we had "Fedora Talk" (yes, I am that old!), and it was
> never really used and was eventually shut down because the calls per
> week were in the single digits.
> As a contributor I don't really care for that sort of service, and
> since Fedora Talk I think the demand has declined, because I can use
> any number of other services for voice communication with other
> Fedora people, and so I use many different ones to communicate with
> various contributors all over the world, depending on
> location/ISP/timezone, etc.

Actually, the calls were in single digits per month, with a couple of
them just being test calls people would do to see if the service still
worked. Then we had a spammer connect to it and people getting
'called' by it. That was pretty much the end of the call system.

The usual questions with a 'phone' system are:

1) Who can I talk with?
2) What do I need to use to talk with them?
3) How do I NOT get called by people?

The infrastructure issues are:

1) What are the resource needs of the service (disk, CPU, network)?
2) What are the authentication/authorization methods of the service?
   (The Asterisk setup used plain-text passwords, and while people
   were not supposed to use their fedoraproject.org password, most
   did.)

If the fedrtc.org system only connects Fedora people to Fedora people,
it may not be too useful... it would be more useful to consolidate it
with the Debian one, as that would allow multiple FLOSS projects to
work with each other.

[And sorry if this is something that is obvious... fedrtc.org timed
out when I tried to get to it the first time, and the second time it
only gave me part of the page.]

> The concern I have, as an onlooker who knows how snowed under the
> infra team already is because of the services they provide: there are
> all sorts of implications, from resilience to security to
> maintenance, that a service such as this demands of the infra team,
> but I've seen no actual hard statistics for the usage fedrtc.org gets
> to justify such an investment.
> I personally think that's needed, and it can be presented on-list if
> you can't make a meeting; it might even be better for you to outline
> the statistics/demands/requirements here first to save time.
>
> Peter
> _______________________________________________
> infrastructure mailing list
> infrastructure@xxxxxxxxxxxxxxxxxxxxxxx
> http://lists.fedoraproject.org/admin/lists/infrastructure@xxxxxxxxxxxxxxxxxxxxxxx

--
Stephen J Smoogen.